this post was submitted on 26 Feb 2025
754 points (99.0% liked)


Update: After this article was published, Bluesky restored Kabas' post and told 404 Media the following: "This was a case of our moderators applying the policy for non-consensual AI content strictly. After re-evaluating the newsworthy context, the moderation team is reinstating those posts."

Bluesky deleted a viral, AI-generated protest video in which Donald Trump is sucking on Elon Musk’s toes because its moderators said it was “non-consensual explicit material.” The video was broadcast on televisions inside the offices of the Department of Housing and Urban Development earlier this week, and quickly went viral on Bluesky and Twitter.

Independent journalist Marisa Kabas obtained a video from a government employee and posted it on Bluesky, where it went viral. On Tuesday night, Bluesky moderators deleted the video because, they said, it was “non-consensual explicit material.”

Other Bluesky users said that versions of the video they uploaded were also deleted, though it is still possible to find the video on the platform.

Technically speaking, the AI video of Trump sucking Musk’s toes, which had the words “LONG LIVE THE REAL KING” shown on top of it, is a nonconsensual AI-generated video, because Trump and Musk did not agree to it. But social media platform content moderation policies have always had carveouts that allow for the criticism of powerful people, especially the world’s richest man and the literal president of the United States.

For example, we once obtained Facebook’s internal rules about sexual content for content moderators, which included broad carveouts to allow for sexual content that criticized public figures and politicians. The First Amendment, which does not apply to social media companies but is relevant considering that Bluesky told Kabas she could not use the platform to “break the law,” has essentially unlimited protection for criticizing public figures in the way this video is doing.

Content moderation has been one of Bluesky’s growing pains over the last few months. The platform has millions of users but only a few dozen employees, meaning that perfect content moderation is impossible, and a lot of it necessarily needs to be automated. This is going to lead to mistakes. But the video Kabas posted was one of the most popular posts on the platform earlier this week and resulted in a national conversation about the protest. Deleting it—whether accidentally or because its moderation rules are so strict as to not allow for this type of reporting on a protest against the President of the United States—is a problem.

top 50 comments
[–] [email protected] 7 points 58 minutes ago

Correct. This is indeed the right decision, to remove the thing. But I have a feeling that this quick reaction does not compare to the speed of decisions for normal people, especially women who get this kind of stuff made about them.

Also, note that I'm not saying it was bad to make the video, or have it run in public on hacked screens.
That is perfectly fine political commentary, by means of civil disobedience.

Just that Bluesky is correct in its action to remove it from their service.

[–] [email protected] 30 points 2 hours ago (6 children)

I seem to be in the minority here, but I am extremely uncomfortable with the idea of non-consensual AI porn of anyone. Even people I despise. It’s so unethical that it just disgusts me. I understand why there are exceptions for those in positions of power, but I’d be more than happy to live in a world where there weren’t.

[–] [email protected] 2 points 51 minutes ago

I agree. I've thought about it a lot and I still don't have any sympathy for them after the harm they've caused. I see why it's newsworthy enough that they might reverse it, and why it would be political speech.

But I also think they made the right choice to take it down. If Bluesky wants to be the better platform, it needs to be better. And not having an exception for this is the right thing.

[–] [email protected] 6 points 1 hour ago (1 children)

I agree with you.

However...there's an argument to be made that the post itself is a form of criticism and falls under the free speech rules where it regards political figures. In many ways, it's not any different than the drawings of Musk holding Trump's puppet strings, or Putin and Trump riding a horse together. One is drawn and the other is animated, but they're the same basic concept.

I understand, however, that this sets a disturbing precedent for what can and cannot be acceptable. But I don't know where to draw that line. I just know that it has to be drawn somewhere.

I think...and this is my opinion...political figures are fair game for this, while there should be protections in place for private citizens, since political figures by their very ambition put themselves in the public sphere whereas private individuals do not.

[–] [email protected] 2 points 48 minutes ago

In my opinion, public figures, including celebrities, give a degree of consent implicitly by seeking to be public figures. I don't think that should extend to lewd or objectionable material for celebrities, but if your behavior has been to seek out being a public figure, you can't be upset when people use your likeness in various ways.

For politicians, I would default to "literally everything is protected free speech", with exceptions for things that are definitively false, damaging, and unrelated to their public work.
"I have a picture of Elon Musk engaging in pedophilia" is all of those, and would be justifiably removed. Anything short of that, though, should be permitted.

[–] [email protected] 5 points 2 hours ago (2 children)

Where do you draw the line for the rich fucks of the world? Realistic CGI? Realistic drawings? Edited photos?

[–] [email protected] 3 points 1 hour ago

This is what I was thinking about myself. Because we're cool with political caricatures, right?

I guess the problem is that nobody wants to feature in non-consensual AI porn. I mean, if you wanted to draw me getting shafted by Musk, that'd be weird, but a highly realistic video of the same event would be hard to explain to the missus.

[–] [email protected] 3 points 1 hour ago

Assuming you’re asking out of genuine curiosity, for me personally, I’d draw the line somewhere along “could this, or any frame of this, be mistaken for a real depiction of these people?” and “if this were a depiction of real children, how hard would the FBI come down on you?”

I understand that that’s not a practical way of creating law or moderating content, but I don’t care because I’m talking about my personal preference/comfort level. Not what I think should be policy. And frankly, I don’t know what should be policy or how to word it all in anti-loopholes lawyer-speak. I just know that this sucking toes thing crosses an ethical line for me and personally I hate it.

Putting it more idealistically: when I imagine living in utopia, non-consensual AI porn of people doesn’t exist in it. So in an effort to get closer to utopia, I disapprove of things that would not exist in a utopia.

[–] [email protected] 1 points 1 hour ago

In my country, the laws about publishing photos etc. are different for anyone considered a "person of public interest". So yeah, imo it should be okay to create cartoons or whatever of politicians without their permission (not porn, ofc), including AI-generated stuff, but that should be marked as such, given how realistic it is now.

[–] [email protected] 0 points 51 minutes ago* (last edited 50 minutes ago) (1 children)

In this case, it's clearly a form of speech and therefore protected under the 1st amendment.

I also don't understand such a strong reaction to non-consensual AI porn. I mean, I don't think it's in good taste but I also don't see why it warrants such a strong reaction. It's not real. If I draw a stick figure with boobs and I put your name on it, do you believe I am committing a crime?

[–] [email protected] 2 points 42 minutes ago

Protected from government censorship. Companies have strong protections allowing them to control the speech on their platforms.

And if you asked Roberts, he'd probably say that since companies are people, as long as it's used to protect conservatives, they have a 1st Amendment right to control their platforms' speech.

[–] [email protected] 0 points 2 hours ago (1 children)

Anything bad that happens to a conservative is good. The world will only get better if they are made to repeatedly suffer.

[–] [email protected] 3 points 1 hour ago (2 children)

No, we cannot think like that. It is true that fascism cannot be beaten peacefully, but we should never want them to suffer. We should always strive to crush their fascist oligarchy with as little suffering as possible.

"Whoever would be a slayer of monsters must take heed, or they may become the very monsters they slay... For when one peers into the abyss, the abyss peers back into thee" -FN

[–] [email protected] 1 points 54 minutes ago

It is true that fascism cannot be beaten peacefully, but we should never want them to suffer

This is true. We should rapidly give them a lead injection, rather than have them suffer.

[–] [email protected] 2 points 1 hour ago

They don't believe anything they aren't experiencing first hand is actually a problem.

As much as I don't like it, they have clearly made their own personal suffering a prerequisite for any solutions being allowed to move forward.

[–] [email protected] 6 points 6 hours ago (1 children)

Bluesky will become just the same as Elon's X...

[–] [email protected] -2 points 2 hours ago

It already is

[–] [email protected] 20 points 10 hours ago* (last edited 10 hours ago) (8 children)

Ah, the rewards of moderation: the best move is not to play. Fuck it is & has always been a better answer. Anarchy of the early internet was better than letting some paternalistic authority decide the right images & words to allow us to see, and decentralization isn't a bad idea.

Yet the forward-thinking people of today know better and insist that with their brave, new moderation they'll paternalize better without stopping to acknowledge how horribly broken, arbitrary, & fallible that entire approach is. Instead of learning what we already knew, social media keeps repeating the same dumb mistakes, and people clamor for the newest iteration of it.

[–] [email protected] 5 points 3 hours ago (2 children)

You clearly were never the victim back in those days. Nor do you realize this approach doesn't work on the modern web even in the slightest, unless you want the basics of enlightenment, and therefore of science and democracy, crumbling down even faster.

Anarchism is never an answer; it's usually willful ignorance about there being any problems.

[–] [email protected] 3 points 1 hour ago (2 children)

Anarchism is never an answer, it's usually willful ignorance about there being any problems.

AnCaps drive me nuts. They want to dismantle democratic institutions while simultaneously licking the boots of unelected institutions.

[–] [email protected] 1 points 24 minutes ago

People against AnCaps usually only disagree with them about the way institutions are being dismantled.

In any case, looking through the eyes of an AnCap you might get valuable insights, and that should be obvious to an intelligent person of any school in regard to any other.

[–] [email protected] 3 points 58 minutes ago (1 children)

I guess I don't really consider AnCaps to be Anarchists, because Anarchy is generally a leftist philosophy. Traditional anarchy is like small-government socialism: empowered local unions and city governments.

[–] [email protected] 1 points 18 minutes ago

You know what's funny about Stalinism that everyone forgets about?

Its structures were similar to what you describe at the lower levels. Districts and factories and such all had their councils (soviet means council), from which representatives were elected to the councils of the next level up. They were still pretty despotic for most of that period, because crowd rule leads to despotism.

Democracy shouldn't be made too small and too unavoidable. In some sense an imagined hillbilly village is democratic with that problem.

Point being that this didn't look much like some people imagine anarchy.

Anyway, ancaps are not particularly attached to the name, and themselves prefer the words "voluntarism" and "agorism" and a few others. But it's one of the most common names for the ideology.

[–] [email protected] 1 points 52 minutes ago (1 children)

Anarchism is never an answer

This isn't anarchism, as described. Anarchism, like actual anarchism, is the only likely solution, imo. No gods, no masters, no idols.

[–] [email protected] 1 points 17 minutes ago

A solution involves answering where to get the energy to bury the gods, masters and idols. They are well-armed, and those seeking solutions are not.

[–] [email protected] 2 points 3 hours ago

I think there's a huge difference between fighting bullying or hate speech against minorities and making fun of very specific, very public people.

[–] [email protected] 3 points 4 hours ago

Elon acts like a new Reddit mod drunk on power. He's the guy screaming in the comments that he knows how to run the forum better, who then seized the chance, and now he cannot fathom why people hate him.

[–] [email protected] 11 points 6 hours ago* (last edited 6 hours ago)

You need some kind of moderation for user generated content, even if it’s only to comply with takedowns related to law (and I’m not talking about DMCA).

[–] [email protected] 8 points 6 hours ago

Fuck it is & has always been a better answer

Sure. Unless you live in a place that has laws and law enforcement. In that case, it's "fuck it and get burnt down".

[–] [email protected] 7 points 6 hours ago (1 children)

You do remember snuff and goatse and csam of the early internet, I hope.

Even with that of course it was better, because that stuff still floats around, and small groups of enjoyers easily find ways to share it over mainstream platforms.

I'm not even talking about big groups of enjoyers: ISIS (rebranded sometimes), Turkey, Azerbaijan, Israel, Myanmar's regime, cartels, and everyone else share what they want of the snuff genre, and it stays up long enough.

In text communication their points of view are also less likely to be banned or suppressed than mine.

So yes.

Yet the forward-thinking people of today know better and insist that with their brave, new moderation they’ll paternalize better

They don't think so; they just use the opportunity to do this stuff in an area where immunity against it is not yet established.

There are very few stupid people in positions of power, competition is a bitch.

[–] [email protected] 4 points 3 hours ago* (last edited 3 hours ago) (1 children)

I'm weirded out when people say they want zero moderation. I really don't want to see any more beheadings or CSAM, and moderation can prevent that.

[–] [email protected] -3 points 3 hours ago (1 children)

Moderation should be optional.

Say a message may carry any number of "moderating authority" verdicts, and a user sets up how those verdicts are combined: see only messages vetted by authority A, only those vetted by authority B, only those vetted by A logical-or B, or all messages not blacklisted by authority A, and plenty of other variants; say, we trust authority C unless authority F thinks otherwise, because we trust authority F to know the things C is trying to reduce in visibility.

Filtering and censorship are two different tasks. We don't need censorship to avoid seeing CSAM. Filtering is enough.

This fallacy is very easy to encounter: people justify the need to censor something for everyone by their own unwillingness to encounter it, as if that were not solvable; they also refuse to see that it's technically solvable. Such a "verdict" from a moderation authority, by the way, is no harder to implement than an upvote or a downvote.

For a human or even a group of humans it's hard to pre-moderate every post in a given period of time, but that's solvable too: put an AI classifier in front of the humans and have them check only the uncertain cases (or ones someone complained about, or ones another good moderation authority flagged the other way; you get the idea).
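To make that concrete, here is a minimal sketch of the verdict-combining idea: my own illustration, not any existing Bluesky or AT Protocol API, and all the names in it (Post, allow, the authorities A/B/C/F and their labels) are hypothetical.

```python
# Minimal sketch of "optional moderation": posts carry verdicts from independent
# moderation authorities, and each reader picks their own rule for combining them.
# Everything here (Post, allow, authorities A/B/C/F, the labels) is hypothetical.

from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Post:
    text: str
    # authority name -> label it applied, e.g. {"A": "ok", "F": "blocked"}
    verdicts: Dict[str, str] = field(default_factory=dict)


# A user-chosen policy is just a predicate over a post's verdicts.
Policy = Callable[[Dict[str, str]], bool]


def allow(post: Post, policy: Policy) -> bool:
    return policy(post.verdicts)


# Example policies, mirroring the combinations described above.
def only_vetted_by_a(v: Dict[str, str]) -> bool:      # only messages vetted by A
    return v.get("A") == "ok"


def vetted_by_a_or_b(v: Dict[str, str]) -> bool:      # vetted by A logical-or B
    return v.get("A") == "ok" or v.get("B") == "ok"


def not_blacklisted_by_a(v: Dict[str, str]) -> bool:  # everything A hasn't blocked
    return v.get("A") != "blocked"


def trust_c_unless_f(v: Dict[str, str]) -> bool:      # trust C unless F objects
    return v.get("C") == "ok" and v.get("F") != "blocked"


feed = [
    Post("harmless post", {"A": "ok", "C": "ok"}),
    Post("awful post", {"A": "blocked", "B": "ok"}),
]
print([p.text for p in feed if allow(p, not_blacklisted_by_a)])  # ['harmless post']
```

The point of the sketch is that a verdict is just a label, about as cheap to record as an upvote or downvote, and all the combining logic lives on the reader's side instead of in a single global takedown switch.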

I like that subject, I think it's very important for the Web to have a good future.

[–] [email protected] 4 points 2 hours ago (1 children)

people justify the need to censor something for everyone by their own unwillingness to encounter it...

I can't engage in good faith with someone who says this about CSAM.

Filtering and censorship are two different tasks. We don’t need censorship to avoid seeing CSAM. Filtering is enough.

No it is not. People are not tagging their shit properly when it is illegal.

[–] [email protected] -1 points 2 hours ago

I can't engage in good faith

Right, you can't.

If someone posts CSAM, police should get their butts to that someone's place.

No it is not. People are not tagging their shit properly when it is illegal.

What I described doesn't have anything to do with people tagging what they post. It's about users choosing the logic of interpreting moderation decisions. But I've described it very clearly in the previous comment, so please read it or leave the thread.

[–] [email protected] 4 points 6 hours ago (1 children)

I miss the early days of the internet when it was still a wild west.

Something like I hate you myg0t 2 or Pico's School would have gotten the creators cancelled if released in 2025.

[–] [email protected] 4 points 3 hours ago

Note on the term canceling. Independent creators cannot, by definition, get canceled. Unless you literally are under a production or publishing contract that gets actually canceled due to something you said or did, you were not canceled. Being unpopular is not getting canceled, neither is receiving public outrage due to being bad or unpopular. Even in a figurative sense, just the fact that the videos were published to YouTube and can still be viewed means they were not canceled. They just fell out of the zeitgeist and aren't popular anymore, that happens to 99% of entertainment content.

[–] [email protected] 17 points 9 hours ago (1 children)

I had to hack an ex’s account once to get the revenge porn they posted of me taken down.

There’s a balance at the end of the day.
