this post was submitted on 17 Nov 2023
593 points (95.1% liked)


"If you’ve ever hosted a potluck and none of the guests were spouting antisemitic and/or authoritarian talking points, congratulations! You’ve achieved what some of the most valuable companies in the world claim is impossible."

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (1 children)

You'd be surprised by how much of the Internet was built by furries, BDSM folk, and other people whose porn a lot of folks think is weird and icky.

Also, you seem to have misunderstood the gist of my comment, or I wasn't clear enough. The tools to deal with CSAM will of necessity be a lot stronger than content moderation that's driven by users' preferences of what they'd like not to see.

[–] [email protected] -3 points 1 year ago* (last edited 1 year ago) (1 children)

The issue is your categorization, and the thought, or lack of thought, that went into making it: "real CSAM" and "the icky stuff".

When you categorized the first as "real", it left a gap for "fake" and "implied" CSAM, which I, the reader, am left assuming goes in your other category, especially since that other category has no specifics, and we all know what CSAM is. That was the logic behind my comment:

"If somebody is tiptoeing around abusive material it's because they want to view abusive material."

Also, I find it suspect that you've characterized the issue with CSAM as being that you can get in trouble for owning it, not that making it wrecks somebody's fucking life...

Honestly, I think you would be better off deleting your comment completely. White knighting the term "questionable pictures" in a public forum isn't a good look, regardless of what you meant.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

I'm talking about the necessities of moderation policy.

The things you find "suspect" that I'm not saying? Those are things I think are obviously true and don't need to be restated. Yes, child abuse is very bad. We know that. I don't need to say it over again, because everyone already knows it. I'm talking specifically about the needs of moderation here.

I'm pointing at the necessary distinction between "you personally morally object to that material" and "that material will cause the law to come down on you and your users and anyone who peers with you".

You should have the ability to keep both of those off your server, but the latter is way more critical.


"White knighting"? Delete your account.

[–] [email protected] -1 points 1 year ago

"source provided" lmao