this post was submitted on 22 Mar 2024
497 points (93.8% liked)

Technology

[–] [email protected] 68 points 7 months ago (1 children)

And trust me, these generated images are getting scarily good.

I have to agree, I would not be able to spot a single one of them as fake. They look convincingly authentic, IMO.

[–] [email protected] 143 points 7 months ago (7 children)

Stalin famously ordered people he had killed erased from photos.

Imagine what current and future autocratic regimes will be able to achieve when they want to rewrite their histories.

[–] [email protected] 50 points 7 months ago* (last edited 7 months ago) (4 children)

Stalin famously ordered people he had killed erased from photos.

This checks out, here's an article about it: https://www.history.com/news/josef-stalin-great-purge-photo-retouching

So why are you downvoted? Maybe because your view is too optimistic? And the problem isn't only with autocratic regimes; it's much more general.
How do we validate anything when everything can be easily faked?

[–] [email protected] 24 points 7 months ago (1 children)

Probably just because some people really like Stalin, and have become convinced his accounts are the truthful ones and everyone else lies about him.

[–] [email protected] 5 points 7 months ago

That's a scary thought!! But all kinds of crazy exist, and people would have to be literally crazy to want to live under a regime like the one Stalin created.

[–] [email protected] 16 points 7 months ago

So why are you downvoted?

lemmygrad dot ml

[–] [email protected] 9 points 7 months ago (2 children)

With AI video also getting increasingly impressive and believable, I worry that we will soon live in a world where you could have actual video evidence of a murder, and that evidence would be dismissed or cast into doubt because of how easy, or supposedly easy, it would be to fake.

[–] [email protected] 3 points 7 months ago

Absolutely, only video from trusted sources can be used. But isn't that already the case?

[–] [email protected] 1 points 7 months ago (1 children)

Better than having people get convicted based on fake evidence, though.

[–] [email protected] 2 points 7 months ago

I think they are both equally scary. I'm imagining cases where photo and video evidence have played major roles in proving police abuses of power, for example. We will certainly have an onslaught of people faking evidence of all sorts of things to push a political narrative, but equally, in any politicized narrative, politically inconvenient photos or videos of real things that really happened might be swept under the rug as "someone probably just faked that for political gain."

Sure, you could have an investigation to look into the authenticity of the evidence, or look at other forensic evidence, but probably only if you can afford such an investigation, or if enough public attention gets drawn to it.

I fear we are reaching a scary time where, in a sense, reality will be whatever people want it to be, and we will increasingly be unable to trust anything we see as real with absolute certainty. We have been headed down this road for a very long time, but this will just make it much worse.

[–] [email protected] 8 points 7 months ago (3 children)

“Photoshopping” something bad has existed for a long time at this point. AI-generated images don’t really change anything, other than the entire photo being fake instead of just a small section.

[–] [email protected] 19 points 7 months ago

I’d disagree. It takes, now, zero know-how to convincingly create a false image. And it takes zero work. So where one photo would take one person a decent amount of time to convincingly pull off, now one person can create 100 images or more in that time, each one a potential time bomb that will go off when it starts getting passed around as evidence of something. And there are uncountable numbers of bad actors on the internet trying to cause a ruckus. This just increased their chances of succeeding at least 100-fold, and opened the access to many, many others who might just do it accidentally, for a joke, or who always wanted to create waves but didn’t have the photoshop skills necessary.

[–] [email protected] 11 points 7 months ago (1 children)

It changes a lot. Good Photoshopping skills would not create the images as shown in the article.

[–] [email protected] 4 points 7 months ago

Yeah some of these would be like 100 layer creations if someone was doing it themselves in photoshop -- It would take a professional or near-professional level of skills.

[–] [email protected] 4 points 7 months ago

The ease and speed with which AI photos can be created, at a quality most photoshoppers could only dream of, very much changes everything.

[–] [email protected] 11 points 7 months ago* (last edited 7 months ago)

Digital image editing has been really good for this kind of stuff for quite a while. Now it’s even easier with content aware fill.

Unless you’re the PR manager for the British Royal family. Then you somehow lack the basic skills to make convincing edits.

[–] [email protected] 5 points 7 months ago
[–] [email protected] 3 points 7 months ago (1 children)

I can imagine such regimes nowadays developing some sort of cryptographic photo attestation, so any photo not signed by them would be shown as untrusted, regardless of whether it's fake or not. And all the code, from the processor to the camera app, would need to be approved by their servers in order to get a signature.

Oh wait! Our great friends at Adobe, Intel, Google and Microsoft are already working on just that: https://c2pa.org/
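The attestation idea above can be sketched in a few lines. This is a deliberately minimal stand-in: real C2PA-style provenance uses per-device public-key certificates and signed manifests, not a shared HMAC secret like the hypothetical `DEVICE_KEY` here, but the trust model is the same — an untouched photo verifies, any edit (or any photo from an unapproved device) fails.

```python
import hashlib
import hmac

# Hypothetical key baked into an "approved" camera. Real attestation
# schemes use asymmetric keys and certificate chains instead.
DEVICE_KEY = b"secret-key-baked-into-approved-camera"

def sign_photo(image_bytes: bytes) -> str:
    """Produce a signature over the raw image bytes."""
    return hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_photo(image_bytes: bytes, signature: str) -> bool:
    """Recompute the signature; any pixel change makes it mismatch."""
    expected = sign_photo(image_bytes)
    return hmac.compare_digest(expected, signature)

photo = b"\x89PNG...original pixel data"
sig = sign_photo(photo)

print(verify_photo(photo, sig))             # untouched photo: trusted
print(verify_photo(photo + b"edit", sig))   # edited photo: untrusted
```

Note the double edge the comment is pointing at: the same mechanism that proves a photo is unedited also lets whoever controls the keys decide which devices, and therefore which photos, count as "real" at all.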

[–] [email protected] 3 points 7 months ago (1 children)

It won't help with uncles on Facebook spreading lies.

[–] [email protected] 3 points 7 months ago

It would not help with anything.

[–] [email protected] 3 points 7 months ago* (last edited 7 months ago)

Honestly, it looks like the picture on the left is fake, like the guy was inserted into it. Just look at his outline, compared with the rest of the background.

(I'm no Stalin fan, just commenting on the picture itself.)