[–] [email protected] 6 points 4 months ago (4 children)

Agreed. To me, making them is one thing; it's like making a drawing at home. Is it moral? Not really. Should it be illegal? I don't think so.

Now, what this kid did, distributing them? Absolutely not okay. At that point it's not private, and it could hurt her reputation.

This of course ignores the fact that she's underage, which is wrong on its own. AI-generated CSAM is still CSAM.

[–] [email protected] 14 points 4 months ago (1 children)

A friend in high school made nude drawings of another mutual friend. It was weird that he showed me, but he was generally an artsy guy, I knew he was REALLY into this girl, and it was kind of in the context of showing me his artwork. I reconnected with the girl years later and talked about this, and while she said it was weird, she didn't really think much of it. Rather, the creepy part to her was that he showed people.

I don't think we can stop horny teens from making horny content about their classmates; heck, I know multiple girls who wrote erotic stories featuring classmates. The sharing (and realism) is what turns that creepy but kind of understandable teenage behavior into something we need to deal with.

[–] [email protected] 9 points 4 months ago (1 children)

All I'm hearing is jail time for Tina Belcher and her erotic friend fiction!

But seriously, I generally agree that as long as people aren't sharing it, it shouldn't be a problem. If I can picture it in my head without consequence, it seems kinda silly that putting that thought on paper or a screen should be illegal.

[–] [email protected] 8 points 4 months ago (1 children)

Exactly, and it raises the question: where's the line? If you draw a stick figure of your crush with boobs, is that a crime? Is it when you put an arrow and write her name next to it? AI just makes it more realistic, but it's the same basic premise.

Distributing it is where it crosses a hard line and becomes something that should not be encouraged.

[–] [email protected] 0 points 4 months ago* (last edited 4 months ago)

It's not some slippery slope to prohibit people from generating sexual imagery of real people without their consent. The fuck is wrong with AI supporters?

Even if you're a "horny teenager" making fake porn of someone is fucking weird and not normal or healthy.

[–] [email protected] 10 points 4 months ago* (last edited 4 months ago) (1 children)

AI-generated CSAM is still CSAM.

Idk, with real people the determination of whether someone is underage is based on their age, not their physical appearance. There are people who look unnaturally young who could legally do porn, and underage people who look much older but aren't allowed. It's not about their appearance, but how old they are.

With drawn or AI-generated CSAM, how would you draw the line between what's fine and what's a major crime with lifelong repercussions? There's no actual age to use, and the images aren't real, so how do you determine a legal age? Do you use a physical developmental point scale and pick a value that's developed enough? Do you have a committee that just says "yeah, looks kinda young to me" and convicts someone of child pornography?

To be clear, I'm not trying to defend these people, but trying to determine what counts as legal or illegal for fake images seems like a legal nightmare. I'm sure there are cases where it would be more clear-cut (if they AI-generate with a specific age, try to make deepfakes of a specific person, etc.), but a lot of it seems really murky when you try to imagine how to actually prosecute over it.

[–] [email protected] 6 points 4 months ago

All good points, and I'll for sure say that I'm not qualified to answer that. I also don't think politicians or moms' groups or anyone else are.

All I'll do is muddy the waters more. The vast majority of humanity thinks CSAM is sick and that those who consume it are not healthy. I've read that psychologists are split. Some think AI-generated CSAM is bad, should be illegal, and only makes those who consume it worse. Others, however, suggest that it may actually curb urges, and ask why not let them generate it, since it might reduce the harm done to real children.

I personally have no idea, and again I'm not qualified to answer those questions, but goddamn did AI really just barge in without us being ready for it. Fucking big tech again: "I'm sure society will figure it out."

[–] [email protected] 2 points 4 months ago

Reputation matters less than harassment. If these people were describing her body publicly, it would be a similar attack.