> Photorealistic images of CP? I think that crosses the line, and needs to be treated as if it was actual CP as it essentially enables real CP to proliferate.

While I absolutely don't want to sound like I'm defending the practice (because I'm not), I'm really not too sure of this. If this were true, would similar logic apply to other AI-generated depictions of illegal or morally reprehensible situations? Do photorealistic depictions of murder make it more likely that the people going out of their way to generate or find those pictures will murder someone or seek out pictures of real murder? Will depictions of rape lead to actual rape? If the answer to those and similar questions is "no", then why is child porn different? If "yes", then should we declare all of those illegal as well?

It's not that I think AI-generated child porn should be accepted, let alone encouraged, but as was pointed out, it might even be counterproductive to ruin someone's life over AI-generated material in which there is factually no victim, as reprehensible as that material may be; the fact that something disgusts most of us isn't, on its own, a good justification for making it illegal when there is no victim.

The reason I'm not convinced by the argument is that a similar one has been used to argue for, e.g., censorship of video games: the claim that playing relatively realistic "murder simulators" makes people (usually children) more likely to commit violent acts, which according to research isn't the case.

I'd even be inclined to argue that being able to generate AI images of sexualized minors might make it less likely for someone to move on to, e.g., searching for actual child porn or committing abuse, since it's a comparatively easier and safer way to satisfy an urge. I wouldn't be willing to bet on that, though.