this post was submitted on 21 May 2024
510 points (95.4% liked)
Technology
Training is how it knows it…
You can ask it to make an image of a man made of pizza. That doesn't mean it was trained on images of that.
But it means that it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors. Even if it wasn't in a sexual context.
Minors are people. It knows what clothed people of all ages look like. It also knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn't trained on. Naked + child is just a simple equation for it to solve
You can always tell when someone has no clue about AI but has read online about it.
The whole point of those generative models is that they are very good at blending different styles and concepts together to create coherent images. They're also really good at editing images to add or remove entire objects.
I think what @[email protected] meant was that the AI could be trained on what sex is and what children are separately. Then a user request could put those two concepts together.
But as the replies I got show, there were multiple ways this could have been accomplished. All I know is AI needs to go to jail.