this post was submitted on 29 Mar 2024
341 points (93.4% liked)


A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

[–] [email protected] 42 points 5 months ago (3 children)

Doesn't mean distribution should be legal.

People are going to do what they're going to do, and the existence of this isn't an argument to put spyware on everyone's computer to catch it or whatever crazy extreme you can take it to.

But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.

[–] [email protected] 16 points 5 months ago

Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.

[–] [email protected] 6 points 5 months ago

While I agree in spirit, any law surrounding it would need to be very clearly worded, with certain exceptions carved out. Which I'm sure wouldn't happen.

I could easily see people thinking something was of them, when in reality it was of someone else.

[–] [email protected] 3 points 5 months ago (1 children)

I'm not familiar with US law, but… isn't it already a crime of some sort to distribute nudes of someone without their consent? That shouldn't change whether AI is involved or not.

[–] [email protected] 4 points 5 months ago* (last edited 5 months ago)

It might depend on whether a wholesale fabrication would legally count as a nude of the victim at all. It could be considered an image of a different person: if you're making it by putting the victim's face on someone else's body, the "nude" is of someone else, and if it's generated entirely by a computer, it's of no one at all.

It's also unclear whether it would still count if the image really was of someone else and the distributor lied about it being the victim: for example, pretending a headless mirror nude actually sent by another person came from the victim.