The bit about "ignoring it" was more in jest. We do review each report and handle it on a case-by-case basis. My point with that statement is that someone hosting questionable content is going to generate a lot of reports, regardless of whether the content is illegal, and we won't take an operating loss just to let them keep hosting with us.
Usually we try to determine whether it was intentional. If someone is hosting CSAM but is quick and responsive about resolving the issue, we generally won't terminate them immediately. But even if our client is a victim, we are not required to host for them, and after a certain point we will terminate them.
So when we receive a complaint about a client hosting CSAM, review it, and find they are running a site that advertises itself as a place for users to distribute AI-generated CP, we aren't going to let them continue hosting with us.
This is not an accurate statement, at least in the U.S., where we are based. We are not (yet) required to sift through all content uploaded to our servers (not to mention that the complexity of such an undertaking makes it virtually impossible at our level). A few proposed laws would have changed that, as we've seen in the news from time to time. What we are required to do is handle the reports we receive about our clients.
Keep in mind that when I say we are a hosting provider, I'm referring to pretty high up the chain - we provide hosting to clients who might, say, run a Lemmy instance, a Discord bot, or a personal NextCloud server, to name a few examples. A common dynamic is how much abuse your hosting provider is willing to put up with; if you recall the CSAM attacks on Lemmy instances, part of that discussion was the risk of instances getting their servers shut down.
Which is valid: hosting providers will only put up with so much risk to their infrastructure, reputation, and/or staff. That is why people who run sites like Lemmy or image-hosting services usually want to take an active role in preventing abuse - whether or not they are legally liable won't matter when we pull the plug because they are causing us an operating loss.
I'm just going to reply to the rest of your statement down here; I don't think I made my intent clear enough. I originally replied to your statement about AI being used to make CP in the future with a personal anecdote about it already happening. You then asked why I defined AI-generated CP as CSAM, and I clarified - I wasn't actually responding to the rest of that message. I was not touching the question of what impact it might have on the actual abuse of children, merely offering my opinion as to why, legal or not, hosting providers are never going to host that content.
The content will be hosted either way. Whether it is merely relegated to "offshore" providers - still accessible by normal means and not treated as criminal content - or becomes another part of the dark web will be determined at some point in the future. It hasn't become a huge issue yet, but it is rapidly approaching that point.