this post was submitted on 08 Dec 2023
394 points (93.0% liked)


‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[–] [email protected] 40 points 9 months ago (4 children)

I use an ad blocker and haven't seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?

[–] [email protected] 37 points 9 months ago* (last edited 9 months ago) (1 children)

That's disgusting, where are these nude photo sites so I can avoid them? There's so MANY, but which one?!

[–] [email protected] 7 points 9 months ago

Here is an alternative Piped link(s):

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[–] [email protected] 10 points 9 months ago (1 children)

Sus question lmfao

These things have been around since the onset of deepfakes, and if you take a couple of seconds to look, you'll find them. It's a massive issue, and the content is everywhere.

[–] [email protected] 12 points 9 months ago (1 children)

This has been around in some form since way before deepfakes.

[–] [email protected] 1 points 9 months ago (1 children)

We're talking specifically about AI enhanced fakes, not the old school Photoshop fakes -- they're two completely different beasts

[–] [email protected] 10 points 9 months ago (1 children)

Different only in construction. Why they exist and what they are is older than photography.

[–] [email protected] -3 points 9 months ago (2 children)

No, I disagree. Before, you could tell a fake from a mile away, but deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing.

[–] [email protected] 6 points 9 months ago (3 children)

That is a quality improvement, not a shift in nature.

[–] [email protected] 2 points 9 months ago

Or maybe an accessibility improvement. You don't need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.

[–] [email protected] 1 points 9 months ago

The difference is that we can now do video. In principle that was possible before too, but it was a hell of a lot of work. Making a still image look real hasn't been hard since before Photoshop; if anything, people get sloppy with AI, partly because what feels like 99% of the people using it don't have an artistic bone in their body.

[–] [email protected] 1 points 9 months ago

I'm not saying that it's a shift in nature? All I've been saying is:

A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing

B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they're more convincing and therefore can have more detrimental effects

[–] [email protected] 4 points 9 months ago

There was a brief period, between the invention of photography and now, when that was true. For thousands of years before that, it was possible to create a visual representation of anything you could imagine without any hint that it wasn't real. Makes me wonder if there were similar controversies about drawings or paintings.

[–] [email protected] 3 points 9 months ago

I don't understand either.