this post was submitted on 23 Jan 2025
758 points (97.4% liked)


TL;DR if you don't want to watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this five times, changing their location to a random US city each time.

Below is the number of shorts after which alt-right content was recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (with either AI Jesus talking to you, or an AI narrator narrating verses from the Bible). After this, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same thing in another, similar experiment (which dealt with long-form content instead of shorts). After some shorts, one came up where an AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and therefore rank higher in the algorithm. They say the algorithm isn't necessarily left-wing or right-wing, but that alt-right creators have a better understanding of how to capture and grow an audience on the platform.
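A rough, purely illustrative sketch of what "more emotion ranks higher" could look like if a feed were scored on engagement alone (all field names and weights below are made up for illustration, not anything YouTube has published):

```python
# Toy engagement-only ranker: every field name and weight here is invented
# for illustration; this is NOT YouTube's actual scoring.
from dataclasses import dataclass

@dataclass
class Short:
    title: str
    watch_fraction: float     # average fraction of the short that viewers finish (0..1)
    likes_per_view: float
    comments_per_view: float  # angry comment threads count just as much as friendly ones

def engagement_score(s: Short) -> float:
    # A ranker optimizing only for engagement is indifferent to WHY people engage;
    # content that provokes strong emotion scores well either way.
    return 0.6 * s.watch_fraction + 0.25 * s.likes_per_view + 0.15 * s.comments_per_view

feed = [
    Short("calm woodworking tips", 0.55, 0.02, 0.001),
    Short("outrage-bait political rant", 0.80, 0.04, 0.030),
]

for s in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(s):.3f}  {s.title}")
```

Under that kind of scoring, whichever side reliably provokes the strongest reactions gets surfaced more, which is exactly the hypothesis.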

[–] [email protected] 21 points 12 hours ago (3 children)

I'll get downvoted for this, with no explanation, because it's happened here and on reddit.

I'm a liberal gun nut. Most of my limited YouTube time is spent watching gun-related news and such. You would think I'd be overrun with right-wing bullshit, but I am not. I have no idea why this is. Can anyone explain? Maybe because I stick to the non-political, mainstream guntubers?

The only thing I've seen start to push me to the right was watching survival videos. Not some "dems gonna kill us all" bullshit, just normal, factual stuff about how to survive without society. That got weird fast.

[–] [email protected] 3 points 5 hours ago

I've noticed most firearms channels steer well clear of politics unless it's directly related to the topic at hand, partly, I think, to appeal to an international audience.

I do think the algorithm puts firearms and politics into very separate categories; someone watching Forgotten Weapons probably isn't going to be interested in political content.

[–] JackbyDev 9 points 8 hours ago

Their algorithms are probably good enough to know you're interested in guns but not right wing stuff. Simple as that.

[–] [email protected] 2 points 12 hours ago* (last edited 2 hours ago) (3 children)

Yeah, I don't think I've ever seen alt-right nonsense without actively looking for it. Occasionally I'll get recommended some Joe Rogan or Ben Shapiro nonsense, but that's about it.

I consider myself libertarian and a lot of my watch time is on Mental Outlaw (cyber security and dark web stuff), Reason (love Remy and Andrew Heaton videos), and John Stossel, but other than that, I largely avoid political channels. I watch a fair amount of gun content as well.

If I get recommended political stuff, it's usually pretty mainstream news entertainment, like CNN or Fox News. Even the crypto nonsense is pretty rare, even though I'm pretty crypto-positive (not interested in speculation, though; only in its use as a currency and the technical details).

If you're seeing alt-right crap, it's probably because you've watched a lot of other alt-right crap.

[–] [email protected] 11 points 10 hours ago (1 children)

I have had the opposite experience. I watch a few left-leaning commentary channels. Sam Seder, my boy Jesse Dollomore. If I watch a single video about guns (with no apparent ideological divide), within a single refresh I'm getting Shapiro and Jordan Peterson videos. I'm in a red Western state. My subscriptions are mostly mental health, tech, and woodworking. I have to delete history if I stray even a little bit.

[–] [email protected] 1 points 2 hours ago

I've watched some Sam Seder and related channels as well, mostly when I follow a link from Lemmy or something, but they don't get recommended unless I watch a bunch. I'm more likely to see Bill Maher or something else more mainstream from the left than a smaller podcaster like Seder. I'd say I see Bill Maher about as much as Jordan Peterson, and I almost never watch either.

I'm in a red state too. My voting district is also one of the more conservative in the state (70+% GOP according to voting stats), though I work in one of the more liberal areas.

[–] [email protected] 5 points 9 hours ago (2 children)

My watch history would peg me as NOT a Republican. YouTube's Shorts feed will serve me:

  • An excerpt from a YouTuber's longer video
  • A TikTok repost from, like, the truck astrology guy or the "rate yer hack, here we go" guy, etc.
  • An artificial voice reading something scraped from Reddit, with Sewer Jump or Minecraft gameplay in the background
  • Chris Boden
  • Clip from The West Wing
  • Clip from Top Gear or Jeremy Clarkson's Farm
  • "And that's why the Bible tells us that Jesus wants you to hate filthy fucking liberals."

"Do not recommend channel." "The downvote button doesn't even seem to be a button anymore but I clicked it anyway." "Report video for misinformation and/or supporting terrorism." But the algorithm keeps churning it up.

[–] [email protected] 1 points 1 hour ago

truck astrology guy

Huh, never seen that, but of course that exists. I watched part of one and it was as cringy as I thought it would be.

From that list, only one (the last) is anything close to "alt-right". I'm guessing a lot of people who like truck astrology or Top Gear also watch alt-right crap, so whatever is causing you to be recommended those videos is probably also leading to that last one.

I don't think YouTube's algorithm looks at the content itself; it more likely looks at what other people who watched similar videos to you went on to watch.
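For what it's worth, that guess is basically how item-to-item collaborative filtering works. A minimal sketch of the idea (toy data and names, nothing YouTube-specific):

```python
# Minimal item-to-item collaborative filtering sketch: videos get recommended
# because people with overlapping watch histories watched them, not because of
# anything in the videos themselves. Toy data; not YouTube's real system.
from collections import defaultdict

# user -> set of videos they have watched
histories = {
    "u1":  {"truck_astrology", "top_gear", "altright_rant"},
    "u2":  {"truck_astrology", "altright_rant"},
    "u3":  {"top_gear", "west_wing"},
    "you": {"truck_astrology", "top_gear", "west_wing"},
}

def recommend(user: str, k: int = 3) -> list[str]:
    seen = histories[user]
    scores: dict[str, int] = defaultdict(int)
    for other, watched in histories.items():
        if other == user:
            continue
        overlap = len(seen & watched)   # how similar the other viewer's history is
        for video in watched - seen:    # their videos you haven't seen yet
            scores[video] += overlap
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("you"))  # ['altright_rant'] -- recommended purely from co-watching
```

If enough people who watch truck astrology or Top Gear clips also watch the alt-right stuff, that co-watching alone is enough to pull it into your feed.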

[–] [email protected] 5 points 8 hours ago (1 children)

Guy you replied to is trying to pretend his individual experience is representative of the whole.

[–] [email protected] 1 points 6 hours ago

I'm not sure there is a "representative of the whole" here; I think the YouTube algorithm is modal.

I think it's an evolution of the old spam bots: if you had an email address that in any way indicated you were male, you'd get "v1agra" and "c1alis" ads nonstop. I'm sure a woman's inbox would get makeup and breast enlargement spam or some shit instead; whatever they can make you feel insecure enough to buy.

[–] [email protected] 4 points 11 hours ago (1 children)

Or the people around you do. I'm shocked occasionally after going home from work lol.

[–] [email protected] 1 points 1 hour ago (1 children)

Perhaps. I'm in a very red part of a very red state, so following that logic, my feed would be filled with that crap.

I mostly get tech videos because I mostly watch tech videos, and if they mention politics, they tend to be on the left end of the spectrum, because tech people lean left.

[–] [email protected] 1 points 1 hour ago (1 children)

I don't work with tech people. Tech people are smart; they know how to prevent cross-contamination of their social media. Lol, tech people also know how to curate their opinions to suit the situation. My experience has been that tech people are privately very conservative. Kinda like how everyone was shocked when they found out what a piece of shit Musk was. But what do I know? I hope I'm wrong, but I also made a lot of money betting Trump would win this election.

[–] [email protected] 1 points 1 hour ago (1 children)

Tech people are smart

I don't think that's true, or at least I don't think tech people are smarter on average. There are a lot of "blue collar" people in tech, by which I mean they learned a skill and apply it according to orders.

I don't know what you consider "smart," but I recommend talking about serious issues with a tech person and with someone working a skilled blue-collar job (e.g. a mining engineer or a metal fabricator), and I bet you'll have a similar experience. Some of my favorite people to talk to as a kid worked in construction or something like that, because they had a very practical form of intelligence that really resonated with me, instead of the airy BS I got from financial or tech people.

People say tech people are smart, but as someone who works in tech, I don't buy it. I think tech people are just like anyone else, they just have an aptitude for coding.

[–] [email protected] 1 points 33 minutes ago

Sure pal, whatever you say. I have a feeling them blue collar boys have a different opinion, but know you're soft. Which is why you feel like the financial and tech people bully you, cuz.