this post was submitted on 16 Dec 2023
265 points (96.2% liked)

Technology


This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago
YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest::New research shows that a person’s ideological leaning might affect what videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content.

[email protected] · 4 points · 11 months ago

I'd be really curious to know why this seems to be happening to so many people but not me. I'm a hardcore YouTube addict, but there's zero politics in my feed. I even follow many right-wing gun tubers, watch plenty of police bodycam footage, and occasionally might even view one or two videos from people like Jordan Peterson, Joe Rogan, and Ben Shapiro. But even after that, I might only get a few more recommendations for their videos, and once I ignore them, they stop showing up. The only videos YouTube seems to be trying to force-feed me are game streamers I've never heard of, and judging by the view counts on their videos, neither has anyone else.

[email protected] · 5 points · 11 months ago

You might be in a different test group. They always have a few different groups with different settings to check how well the algorithm is meeting their goals. That's how they know they make less money if they don't radicalize people.
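For anyone curious how that kind of group assignment typically works: platforms usually bucket users deterministically, often by hashing a user ID together with an experiment name, so each user stays in the same group across sessions while assignments remain independent between experiments. Here's a minimal sketch of that general technique; the function and group names are illustrative assumptions, not YouTube's actual system.

```python
import hashlib

def assign_bucket(user_id: str, experiment: str, buckets: list[str]) -> str:
    """Deterministically assign a user to one experiment group.

    Hashing user_id together with the experiment name keeps the
    assignment stable across sessions while keeping assignments
    independent between different experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    index = int(digest, 16) % len(buckets)  # map hash to a group slot
    return buckets[index]

# Hypothetical experiment: the same user always lands in the same group.
groups = ["control", "variant_a", "variant_b"]
print(assign_bucket("user_12345", "rec_algo_v2", groups))
```

Because the hash is deterministic, the platform can later compare engagement or revenue metrics between the groups without ever re-recording who was assigned where.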