this post was submitted on 09 Oct 2024
430 points (97.8% liked)

Technology

cross-posted from: https://lemmy.world/post/20664372

X (formerly Twitter) claims that non-consensual nudity is not tolerated on its platform. But a recent study shows that X is more likely to quickly remove this harmful content—sometimes known as revenge porn or non-consensual intimate imagery (NCII)—if victims flag content through a Digital Millennium Copyright Act (DMCA) takedown rather than using X's mechanism for reporting NCII.

[–] [email protected] 124 points 1 month ago* (last edited 1 month ago) (2 children)

In other words, the consent of a corporation is more important than the consent of a human being... for the public distribution of that human being's likeness in an intimate context. Holy dystopia, conservatives are fucked in the head.

[–] [email protected] 8 points 1 month ago

The likelihood of being sued is the issue. The DMCA has tons of case law behind it, making it a powerful tool for media companies; not so much for NCII. Also, individuals usually don't have legal teams.

So, as with much fuckery in the modern world, we can use the rules as well as they can: if someone's image is shared as NCII, they need only flag it as a DMCA violation. The problem is, who is going to know that trick?