this post was submitted on 23 Jan 2025
989 points (97.1% liked)

Technology

60942 readers
4816 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches every recommended short without skipping. They repeat this 5 times, each time changing their location to a random city in the US.

Below is the number of shorts after which alt-right content was first recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. Then AI Jesus shorts started appearing (with either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After that came non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content at the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same in a similar earlier experiment (which dealt with long-form content instead of shorts). After some shorts, one appeared in which an AI Gru (the main character from Despicable Me) told you to vote for Trump. He went on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content may incite more emotion and therefore rank higher in the algorithm. They say the algorithm isn't necessarily left-wing or right-wing, but that alt-right creators have better understood how to capture and grow an audience within it.
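To make that hypothesis concrete, here is a minimal toy sketch of engagement-weighted ranking (my own illustration, not anything from the video and not YouTube's actual system; every field name and number is invented). The scorer never looks at ideology, only at reaction strength, yet the most provocative item still floats to the top:

```python
# Toy model of engagement-weighted ranking (illustrative only, not YouTube's
# actual system; every field name and number here is invented).

videos = [
    {"title": "calm explainer",    "watch_frac": 0.6, "comments": 5,  "shares": 2},
    {"title": "outrage bait",      "watch_frac": 0.9, "comments": 80, "shares": 40},
    {"title": "neutral news clip", "watch_frac": 0.5, "comments": 10, "shares": 3},
]

def engagement_score(v):
    # The score only sees how strongly people react, never what the video says.
    return v["watch_frac"] * 100 + v["comments"] * 2 + v["shares"] * 5

for v in sorted(videos, key=engagement_score, reverse=True):
    print(f"{engagement_score(v):6.1f}  {v['title']}")
```

Under any scorer of this shape, whoever best provokes comments and shares wins the ranking, which is exactly the mechanism Benaminute and Miniminuteman describe.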

[–] [email protected] 2 points 9 hours ago

If the channel is popular, those videos will get recommended.

If it has engagement on top of that, you are fucked; it will definitely get recommended to you.

Either block the channel, block the user, or use incognito. Or don't.

[–] [email protected] 9 points 13 hours ago (1 children)

Yeah, I've gotten more right-wing video recommendations on YouTube, even though I have my history turned off. And even when my history was on, I typically watched left-wing videos.

[–] [email protected] 5 points 12 hours ago

The minute I sign out of my account YouTube tries to radicalize me.

[–] [email protected] 4 points 11 hours ago (4 children)

fresh YouTube account

change their location to a random city in the US

yeah but you're still bound to IP addresses. I was under the impression YouTube used those for their profiling

[–] [email protected] 14 points 16 hours ago

I bet these right-wing shorts are proposed and shoehorned in everywhere because someone pays for the visibility. Simple as that.

[–] [email protected] 1 points 9 hours ago

I use YouTube and don't get much far-right content. My guess is it's because I don't watch much political content. I use a podcatcher and websites for that. If I watched political content, it might show me some lurid videos promoting politics I disagree with because that tends to keep viewers engaged with the site/app longer than if they just showed videos consistent with the ideology I seek out. That gives people the feeling they're trying to push an ideology.

I made that up without any evidence. It's just my guess. I'm a moderate libertarian who leans Democratic because Republicans have not even been pretending to care about liberty, and for whatever reason it doesn't recommend the far-right crap to me.

[–] [email protected] 21 points 18 hours ago* (last edited 18 hours ago) (2 children)

The view farming in shorts makes it even harder to avoid as well. Sure, I can block the JRE channel, for example, but that doesn’t stop me from getting JRE clips from probably day-old accounts which just have some shitty music thrown on top. If you can somehow block those channels, there’s new ones the next day, ad infinitum.

It’s too bad you can’t just disable the tab entirely, I feel like I get sucked in more than I should. I’ve tried browser extensions on mobile which remove the tab, but I haven’t had much luck with PiPing videos from the mobile website, so I can’t fully stop the app.

[–] [email protected] 7 points 17 hours ago (1 children)
[–] [email protected] 1 points 9 hours ago

iOS, unfortunately, but thank you!

I think there are similar options, but I'm pretty burned out on sideloading at this point after doing it with emulators for years

[–] [email protected] 1 points 9 hours ago

I've been happy with BlockTube for blocking channels or single videos. I also use YouTube Shorts Redirect for automatically converting shorts into regular videos.

[–] [email protected] 186 points 1 day ago (8 children)

I think the explanation might be even simpler - right wing content is the lowest common denominator, and mindlessly watching every recommended short drives you downward in quality.

[–] [email protected] 4 points 14 hours ago* (last edited 14 hours ago) (1 children)

yeah i created a new youtube account in a container once and just watched all the popular/drama suggestions. that account turned into a shitstorm immediately

these days i curate my youtube accounts making liberal use of Not interested/Do not recommend channel/Editing my history and even test watching in a container before watching it on my curated account

this is just how "the algorithm" works. shovel more of what you watch in your face.

the fact that they'll give you right-wing, conspiracy-fueled, populist trash right off the bat is the concern

[–] [email protected] 3 points 11 hours ago (1 children)

Man, that seems like a lot of work just to preserve a shitty algorithm that clearly isn't working for you... Just get a third-party app and watch without logging in

[–] [email protected] 1 points 9 hours ago

oddly enough it seems to be working, if i don't login at all youtube just offers up the usual dross

[–] [email protected] 1 points 10 hours ago (2 children)

Isn't the simpler explanation that YouTube always has and always will promote the alt-right? Also, it's no longer the alt-right; it's just the right.

[–] [email protected] 59 points 1 day ago* (last edited 1 day ago) (2 children)

I was gonna say this. There's very little liberal or left-leaning media being made, and what there is is mostly made for a female or LGBTQ audience. Not saying that men cannot watch those, but there's not a lot of "testosterone"-infused content with a liberal leaning (one of the reasons Trump won), so by sheer volume you're bound to see more right-leaning content, especially if you are a cisgender male.

Been considering creating content myself to at least stem the tide a little.

[–] [email protected] 37 points 23 hours ago (3 children)

I think some of it is that liberal media is more artsy and creative, which is more difficult to just pump out. Creation is a lot more difficult than destruction.

[–] [email protected] 4 points 11 hours ago

Plus, fact-based videos require research, sourcing, and editing.

Emotional fiction only takes as long to create as a daydream.

[–] [email protected] 2 points 11 hours ago

Creation is a lot more difficult than destruction.

Yup. A lesson that I fear we will be learning over and over and over in the coming years.

[–] [email protected] 24 points 19 hours ago (4 children)

Filter bubbles are the strongest form of propaganda.

[–] [email protected] 47 points 22 hours ago* (last edited 22 hours ago)

Do these companies put their fingers on the scale? Almost certainly.

But it's exactly what he said that brought us here. They have never particularly given a shit about politics (aside from "no taxes" and "let me do whatever I want all the time"). However, the algorithms consistently reward engagement. Engagement doesn't care about "good" or "bad"; it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting Elon to smoke weed. Someone talking about trans people playing sports. Etc.

This is a natural extension of human behavior. Behavior occurs because it serves a function: I do X to achieve reinforcement, whether that's attention, access to something, escape, or automatic reinforcement.

Attention-maintained behaviors are tricky because people are bad at withholding attention, and attention is a powerful reinforcer. You tell everyone involved, "this person feeds off of your attention, ignore them". Everyone agrees. The problematic person pulls their bullshit, and then someone goes "stop it". They call that negative reinforcement (it's not negative reinforcement; it's probably positive reinforcement, and arguably positive punishment, though it's questionable how aversive it is).

You get people to finally shut up, and they still make eye contact, nonverbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it, and then someone new enters the mix who doesn't get what's going on.

This is the complexity behind all of this. This is the complexity behind "don't feed the trolls". You can teach every single person on Lemmy or Reddit or wherever to simply block a malicious user, but tomorrow a dozen or more new and naive people will register who will fuck it all up.

The complexity behind the algorithms is similar. The algorithms aren't people, but they work in a similar way: if bad behavior is given attention, the content is weighted and given more importance. The more we, as a society, can't resist commenting on, clicking, and sharing Trump, Rogan, Peterson, transphobic, misogynist, racist, homophobic, etc. content, the more the algorithms will weight it as "meaningful".
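As a toy illustration of that feedback loop (again my own sketch with made-up numbers, not any platform's real code): a post that is slightly more provocative gets slightly more engagement, the engagement raises its ranking weight, the higher weight buys it more exposure, and the loop compounds:

```python
import random

random.seed(0)

# Two posts start with equal ranking weight; the controversial one is merely
# more likely to provoke a reaction each time it is shown.
weights = {"measured take": 1.0, "controversial take": 1.0}
react_prob = {"measured take": 0.05, "controversial take": 0.15}

for _ in range(10_000):
    # Exposure is proportional to current weight (the "algorithm").
    post = random.choices(list(weights), weights=list(weights.values()))[0]
    if random.random() < react_prob[post]:
        weights[post] += 0.1  # each reaction feeds back into future ranking

print(weights)  # the controversial post's weight runs away with the feed
```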

This of course doesn't mean these companies are without fault. This is where content moderation comes into play. This is where the many studies finding that social media leads to higher irritability, more passive-aggressive behavior, and lower empathy could have led us to regulate these monsters into doing something to protect their users against the negative effects of their products.

If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that was absurdly allowed to exist in a completely unregulated state with zero transparency as to its inner workings.

[–] [email protected] 1 points 10 hours ago* (last edited 10 hours ago)

Almost no corporation benefits from operating in a liberal/left-leaning country; such countries are harder to exploit and profit from. Why would corporations promote things like worker protection, parental leave, unions, reducing their own rights in favor of society, or paying for healthcare? Edit: Wording

[–] [email protected] 9 points 18 hours ago

With Milo (miniminuteman) in the thumbnail, I thought the video was going to insinuate that his content was part of the alt-right stuff. Was confused and terrified. Happily, that was not the case.

[–] [email protected] 15 points 20 hours ago (10 children)

Saying it disproportionately promotes any type of content is hard to prove without first establishing how much of the whole is made up of that type.

The existence of proportionately more "right" leaning content than "left" leaning content could adequately explain the outcomes.
