this post was submitted on 01 Sep 2023
919 points (96.1% liked)

Memes

45619 readers
1001 users here now

Rules:

  1. Be civil and nice.
  2. Try not to repost excessively; as a rule of thumb, wait at least 2 months if you have to.

founded 5 years ago
[–] [email protected] 126 points 1 year ago* (last edited 1 year ago) (14 children)

On feddit.de, lemmy.world is only temporarily defederated because of CSAM, until a patch is merged into Lemmy that prevents remote images from being downloaded to your own instance.

So I'll just be patient and wait. It's understandable that the admins don't want problems with law enforcement.

[–] [email protected] 22 points 1 year ago (8 children)

Won't that lead to some horrible hug-of-death type scenarios if a post from a small instance gets popular on a huge one?

[–] [email protected] 11 points 1 year ago (1 children)

We need more decentralization: a federated image/GIF host with CSAM protections.

[–] [email protected] 2 points 1 year ago (1 children)

How would one implement CSAM protection? You'd need actual ML to check for it, and I don't think there are trained models available. And good luck finding someone willing to train such a model. Also, running an ML model would be quite expensive in terms of energy and hardware.

[–] [email protected] 2 points 1 year ago

There are models for detecting adult material, though idk how well they'd work on CSAM. Additionally, there exist hash identification systems for known images; idk if they're available to the public, but I know Apple has one.

Idk, but we gotta figure out something
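The hash-matching idea mentioned above can be sketched in a few lines. Real systems (e.g. Microsoft's PhotoDNA or Apple's NeuralHash) use *perceptual* hashes that survive re-encoding and resizing; the minimal stand-in below uses an exact cryptographic hash against a hypothetical blocklist, which only catches byte-identical copies. The `BLOCKLIST` contents and function names are assumptions for illustration.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known banned images.
# (For illustration only: the entry below is the digest of the empty
# byte string, not of any real image.)
BLOCKLIST = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears on the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

# An instance could call is_blocked() on each upload before storing it.
```

An exact hash is cheap to compute and trivially scalable, but a single changed pixel defeats it; that is why production systems pay for perceptual hashing instead.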
