[–] [email protected] 2 points 5 months ago (1 children)

Sure, but isn't the perpetrator the company that trained the model without their permission? If a doctor saves someone's life using knowledge based on Nazi medical experiments, then surely the doctor isn't responsible for the crimes?

[–] [email protected] -3 points 5 months ago (1 children)

So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?

Your analogy doesn't match the premise. (Again, assuming there is no CSAM in the training data, which is unlikely.) The training data is not the problem; it is how the data is used. Using those same pictures to generate photos of medieval kids eating ice cream with their families is fine. Using them to make CSAM is not.

It would be more like the doctor using the Nazi experiments to do some other fucked-up experiments.

(Also, you posted your response like 5 times.)

[–] [email protected] 1 points 5 months ago (1 children)

Sorry, my app glitched out and posted my comment multiple times, which got me banned for spamming... Now that I've been unbanned, I can reply.

> So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?

In this scenario, no, because the crime was in how someone used the car, not in the creation of the car. The guy in this story did commit a crime, but for other reasons. I'm just saying that if you are claiming that the children in the training data are victims of some crime, then that crime was committed when the model was trained. They obviously didn't agree to their photos being used that way, and most likely didn't agree to their photos being used for AI training at all. So by the time this guy came around, they were already victims, and they would still be victims even if he hadn't.

[–] [email protected] 1 points 5 months ago

I would argue that the person using the model for that purpose is further victimizing the children. Kinda like how, with revenge porn, the worst perpetrator is the person who uploaded the content, but every person viewing it from there furthers the victimization. It is mentally damaging for the victim of revenge porn to know that their intimate videos are being seen and sought out.