this post was submitted on 25 Oct 2023
81 points (81.9% liked)

Technology

[–] [email protected] 87 points 11 months ago* (last edited 11 months ago) (17 children)

Drawn art depicting minors in sexual situations has been deemed protected as free speech in the US. It's why, at least in the US, you don't have to worry about the 17-year-old anime girl landing you in prison on child pornography charges. The reasoning: there is no victim, the anime girl is not sentient, so the creation of that art is protected as free speech.

I suspect a similar thing will happen with this. As long as it does not depict a real person, the completely invented person is not sentient and there is no victim, so this will likely fall under free speech. At least in the US.

However, it is likely a very, very bad idea to have any photo-realistic art of this kind, as it may not be clear to authorities whether it was AI-generated or whether there is in fact a real person being victimized. Doubly so if you downloaded it from someone else, as you don't know whether that is a real person either.

[–] [email protected] 61 points 11 months ago* (last edited 11 months ago) (8 children)

Deepfakes of an actual child should be considered defamatory use of a person's image; but they aren't evidence of actual abuse the way real CSAM is.

Remember, the original point of the term "child sexual abuse material" was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse -- such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)

But if a picture does not depict a crime of abuse, and does not depict a real person, it is basically an illustration, same as if it was drawn with a pencil.

[–] [email protected] 3 points 11 months ago (3 children)

Add in an extra twist: hopefully, if the sickos are at least satisfied with AI-generated material, they won't seek out the "real" thing.

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do

[–] [email protected] 9 points 11 months ago

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do

This is the part where I disagree, and I would love for people to prove me wrong. Because whether this is true or false, it will probably be the deciding factor in allowing or restricting "artificial CSAM".

[–] [email protected] 4 points 11 months ago

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do

Have you got a source for this?

[–] [email protected] 2 points 11 months ago (1 children)

Some actually fetishize causing suffering.

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago)

Some people are sadists and rapists, yes, regardless of what age group they'd want to do it with.
