this post was submitted on 21 Jul 2024
607 points (97.9% liked)

World News


How do the algorithms of Facebook and Instagram affect what you see in your news feed? To find out, Guardian Australia unleashed them on a completely blank smartphone linked to a new, unused email address.

Three months later, without any input, the feeds were riddled with sexist and misogynistic content.

Initially, Facebook served up jokes from The Office and other sitcom-related memes alongside posts from 7 News, Daily Mail and Ladbible. A day later it began showing Star Wars memes and gym or “dudebro”-style content.

By day three, “trad Catholic”-type memes began appearing and the feed veered into more sexist content.

Three months later, memes from The Office, Star Wars and now The Boys continue to punctuate the feed, interspersed with highly sexist and misogynistic images that have appeared without any input from the user.

top 50 comments
[–] [email protected] 131 points 5 months ago (19 children)

I mean, it's exactly the same if you visit YouTube without an account or cookies. The Internet has become a swamp of right-wing and neoliberal populism that kicks down on minorities and on people with lower-than-average income in general. The insane amount of completely made-up rage-bait stories that you get recommended is just unfathomable.

I think it's gotten to the point where how many lies a site can throw at you at once needs to be regulated, and I don't say this lightly. I just see no other way to get this mind-eating populist machine under control.

[–] Ismay 51 points 5 months ago (4 children)

YouTube Shorts are the worst. I am transgender, so my history is really not right wing.

30 minutes on Shorts and I end up in Shapiro's Dreamland. It's a nightmare.

[–] [email protected] 13 points 5 months ago (1 children)

Pirate Software has the only shorts on YT worth watching imho

[–] [email protected] 8 points 5 months ago

My favourite thing about his shorts is trying to guess whether it's going to be a wholesome one or one where either he or his chat bestows cursed knowledge onto the other.

[–] [email protected] 10 points 5 months ago

The only political thing I watch is Behind the Bastards, and the algorithm keeps trying to get me to watch right-wing shit. I installed a third-party channel blocker and open suspicious channels in private tabs, and it still serves me right-wing chuds complaining about being suppressed by "woke Google". I would troll the comments section if it weren't for the fact that that counts as "engagement".

load more comments (2 replies)
[–] [email protected] 6 points 5 months ago (2 children)

Regulation won't work, because regulation moves slowly, and these companies find workarounds fast. And as long as the cost of breaking the rule is less than the benefits of doing so, it'll be "just the cost of doing business."

[–] [email protected] 10 points 5 months ago

A simple way to do it is to stop considering them platform providers and start considering them editors, and in my opinion it should be done, because through their recommendation systems they are making editorial choices.

[–] [email protected] 6 points 5 months ago

I have to hard disagree with this utterly pessimistic outlook, because it reads as if any regulation we already have is pointless, so we might as well scrap regulations and rules altogether across the board. That's similar to the neoliberal rhetoric I loathe to see pushed into my recommendations, and it's simply not true. In reality, we do see that regulations sometimes do the trick. It's just that they likely won't regulate them as harshly as I proposed, but that's a different argument. Regulation as an instrument does work.

load more comments (17 replies)
[–] [email protected] 93 points 5 months ago (1 children)

I was explaining this to my daughter not long ago, when she told me she kept getting recommended YouTube videos about something that offended her (I can't remember what, but something Republicans would be in favor of): the algorithm doesn't care whether or not you agree with the videos. It only cares about whether or not you'll watch them. And if you're willing to hate-watch, which many people are, you'll get served the same videos as the people who enjoy them.

And, of course, the more controversial the better because you'll get a whole lot of both groups. So if you post something sexist and hateful, you'll get a huge number of redpill viewers and the like and then all the other people who go to that post to argue with them. Which means the algorithm learns that those are the best things to push on new accounts too.
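
To make the point concrete, here is a minimal, purely illustrative sketch (not Meta's or YouTube's actual ranking code; the fields and weights are invented) of an engagement-only ranker. Nothing in it distinguishes a hate-watch or an angry reply from genuine approval, so controversial posts that attract both crowds rise to the top.

    # Hypothetical engagement-only ranking: sentiment never enters the score.
    from dataclasses import dataclass

    @dataclass
    class Interaction:
        watch_seconds: float  # how long the item stayed on screen
        commented: bool       # any comment, supportive or hostile
        shared: bool          # any share, including outrage shares

    def engagement_score(interactions: list[Interaction]) -> float:
        score = 0.0
        for i in interactions:
            score += i.watch_seconds              # hate-watching counts fully
            score += 5.0 if i.commented else 0.0  # arguing counts as much as praising
            score += 10.0 if i.shared else 0.0
        return score

    # Items are served in order of raw engagement, so posts that attract both
    # fans and angry repliers outrank quietly liked ones.
    def rank(feed: dict[str, list[Interaction]]) -> list[str]:
        return sorted(feed, key=lambda item: engagement_score(feed[item]), reverse=True)

    feed = {
        "wholesome_meme": [Interaction(3.0, False, False)],
        "sexist_ragebait": [Interaction(8.0, True, False),   # hate-watch plus angry reply
                            Interaction(6.0, True, True)],   # approving watch plus share
    }
    print(rank(feed))  # ['sexist_ragebait', 'wholesome_meme']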

[–] [email protected] 13 points 5 months ago (4 children)

You engage with it, you must like it.

[–] [email protected] 37 points 5 months ago (1 children)

That’s just not true. Many people engage with things they don’t like because of education and curiosity.

For instance, I don’t like your comment but I still engaged with it to point out that you’re wrong. I like Lemmy, in general, though.

[–] [email protected] 31 points 5 months ago (2 children)

I took that comment to be summarizing the platform’s perspective, rather than their own. I think it’s common sense that people will watch/engage with things they don’t like, but the algorithms don’t care about how you feel or why you watched something; they see engagement and they give you more of that thing to drive more engagement. As far as the unfeeling numbers go, engagement might as well be liking; they don’t need to distinguish a difference.

load more comments (2 replies)
[–] [email protected] 20 points 5 months ago (1 children)

As someone who used to sit around at a TV station for hours waiting for news to happen so I could go shoot it, I can tell you for a fact that you don't have to like a show to watch it. You just have to be bored and it's in front of you.

That said, I did find out that American Ninja Warrior was amusing.

[–] [email protected] 9 points 5 months ago

Howard Stern's entire career was famously based on that fact.

[–] [email protected] 8 points 5 months ago

You engaged with this post, you must like it.

[–] [email protected] 5 points 5 months ago

No, they think "You engage with it, that makes us money."

[–] [email protected] 57 points 5 months ago (1 children)

That's what happens if you only care about engagement. Chauvinism of any kind is liked by a certain number of people and despised by the rest of us - both positions drive engagement.

[–] [email protected] 6 points 5 months ago

That's why a format like Lemmy is the only way. No desire for profit means we let the content do what it does organically.

[–] [email protected] 37 points 5 months ago* (last edited 5 months ago)

I think they vastly underestimate how many things Meta tracks besides ad tracking. They're likely tracking how long you look at a given post in your feed and will use that to rank similar posts higher. They know your location, what wifi network you're on and will use that to make assumptions based on others on the same network and/or in the same location. They know what times you're browsing at and can correlate that with what's trending in the area at those times, etc.

I have no doubt that their algorithm is biased towards all that crap, but these kinds of investigations need to be more informed in order for them to be useful.
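
As a purely hypothetical illustration of that kind of profiling (the features and data below are invented, not Meta's actual signals): even with zero interaction history, a ranker can fall back on what other accounts sharing the same coarse context (location, network, browsing hours) engaged with.

    # Hypothetical cold-start ranking: a brand-new account has no history, so the
    # ranker borrows the engagement of accounts with a matching context.
    from collections import Counter

    def context_key(profile):
        return (profile["region"], profile["network"], profile["active_hours"])

    def cold_start_recommendations(new_profile, known_profiles, top_n=3):
        """known_profiles: list of (profile_dict, engaged_topics) pairs."""
        tally = Counter()
        for profile, topics in known_profiles:
            if context_key(profile) == context_key(new_profile):
                tally.update(topics)
        return [topic for topic, _ in tally.most_common(top_n)]

    known = [
        ({"region": "AU", "network": "wifi-123", "active_hours": "evening"},
         ["gym", "sitcom memes", "dudebro"]),
        ({"region": "AU", "network": "wifi-123", "active_hours": "evening"},
         ["dudebro", "trad memes"]),
    ]
    blank_account = {"region": "AU", "network": "wifi-123", "active_hours": "evening"}
    print(cold_start_recommendations(blank_account, known))  # e.g. ['dudebro', 'gym', 'sitcom memes']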

[–] [email protected] 32 points 5 months ago* (last edited 5 months ago) (1 children)

Well, it also looks at things like the other devices connected from the same IP, or other devices near where you connect from, and then hones the served content based on that as well.

So if the author, some colleague or even the neighbors are redpilled/MGTOW/Christo-fascists etc., this also makes sense. Possibly even their research into these subjects slanted the results. Let alone what happens if they spun up a VM at a cloud farm and used it from there.

I'm in no way surprised that "social" media corps serve up vile shit for profit, I'm just not convinced by some random "let's see what happens".

Edit: I'd be for a law that required targeted ads to carry a small "why you see this" link; if you click it, the company is required to show you the selection criteria that caused this ad to be served to you, in an easy-to-understand format (leaving out all the irrelevant criteria), i.e.:

You were selected by the following criteria:

  • region: europe
  • gender: male
  • interests: Games, Lemmy, politics
[–] [email protected] 21 points 5 months ago (1 children)

It's also probably looking at scroll speed. So if the people conducting the experiment tended to linger longer examining content they disliked, that could result in getting more of it.

Would need to see a more detailed explanation of the methodology. Ideally the scrolling was done in an automated way, at a consistent speed.

[–] [email protected] 9 points 5 months ago

Yep, it’s called dwell time and it is 100% one of the metrics used by the algorithms that decide what content to serve up.
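
For anyone curious, a rough sketch of how dwell time can be derived purely from scroll events, with no tap or like required (the event format here is invented for illustration, not any platform's real telemetry):

    # Hypothetical dwell-time logging: each event marks a post entering or leaving
    # the viewport as the user scrolls; lingering is indistinguishable from interest.
    from collections import defaultdict

    def dwell_times(scroll_events):
        """scroll_events: list of (post_id, "enter" or "exit", timestamp_seconds)."""
        entered = {}
        totals = defaultdict(float)
        for post_id, kind, ts in scroll_events:
            if kind == "enter":
                entered[post_id] = ts
            elif kind == "exit" and post_id in entered:
                totals[post_id] += ts - entered.pop(post_id)
        return dict(totals)

    events = [("post_a", "enter", 0.0), ("post_a", "exit", 1.5),
              ("post_b", "enter", 1.5), ("post_b", "exit", 9.0)]
    print(dwell_times(events))  # {'post_a': 1.5, 'post_b': 7.5}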

[–] [email protected] 29 points 5 months ago (5 children)

When I first joined Internet communities as a preteen, I just followed forums that interested me and got exposed to whatever people happened to be talking about on those forums.

Why, oh why, has the world decided that we need recommendation algorithms at all?

[–] [email protected] 11 points 5 months ago

The algorithms aren't there to improve the user experience; they're there to increase user engagement. People engage with things positively and they engage with things negatively. The algorithm doesn't care.

Why is every third Reddit post someone "accidentally misspelling" or otherwise humorously butchering a post title? Because people comment on it.

load more comments (3 replies)
[–] [email protected] 25 points 5 months ago* (last edited 5 months ago) (6 children)

I have a roughly three-year-old Facebook account because I have to use Messenger. I hardly use Facebook at all (I even accept friend requests sparingly) and have turned off pretty much anything that provides targeted content (I live in the EU).

Since about last year (or possibly even before), my feed has been about 35% Ikaria ads (a Greek island; I'm from Greece), 10% porn, 10% sexist-misogynistic stuff, 15% sexist-misogynistic porn, 15% Christian stuff and the rest random stuff.

At least this might confirm that turning off targeted content works... 🤷

[–] [email protected] 6 points 5 months ago

I use Facebook because my relatives are spread out across the world and that's the way I know to stay in touch. Also, my brother, who is neurodivergent, mostly likes to communicate with the family as a whole that way, and I want to be able to stay in touch with him too. On top of that, I'm stuck in a town I hate with no friends here, and some old friends from my hometown are on there, so I can talk to them.

Anyway, ever since my brother started talking about how he was taking various hallucinogenic substances and calling himself a psychonaut (he's almost 60, he got into it very late), most of the ads I see are for shroom gummies, ketamine and boner pills. I've done my "psychonaut" stuff back when I was in my teens and twenties. I'm not interested in the former two and the latter is, thankfully, not necessary yet.

The funny thing was that maybe 4 or 5 years ago, I kept getting shown an ad for a wooden hurdy-gurdy kit. Like the medieval instrument. I have no idea why. I have never expressed an interest in playing the hurdy-gurdy, listening to the hurdy-gurdy or building anything out of a wooden kit. It became a joke with me and my friends for a while.

load more comments (5 replies)
[–] [email protected] 20 points 5 months ago* (last edited 5 months ago) (2 children)

So these algorithms time how long you look at things while scrolling. To begin with, sure, there's no deliberate user input, but the mere act of looking, unless it's controlled to be exactly the same for everything, is in fact input.

Getting a sexist thing and lingering on it for an extra five seconds tells the algorithm you engaged with the sexist thing, so it sends you more sexist things.

Edit: it shouldn't be serving it up at all, but that's how this can happen.

load more comments (2 replies)
[–] [email protected] 14 points 5 months ago (1 children)

we opted out of ad tracking so it could not tell what we were doing outside of the app

[–] [email protected] 6 points 5 months ago

Or so it said.

[–] [email protected] 12 points 5 months ago

One of the rules of the Internet is "unsubscribe from the defaults."

[–] [email protected] 7 points 5 months ago

Of course, young men were starting to be much too peaceful and open-minded.

[–] [email protected] 5 points 5 months ago

This is the best summary I could come up with:


On Instagram, while the explore page has filled with scantily-clad women, the feed is largely innocuous, mostly recommending Melbourne-related content and foodie influencers.

Nicholas Carah, an associate professor in digital media at the University of Queensland, said the experiment showed how “baked into the model” serving up such content to young men is on Facebook.

She praises the federal government’s Stop it at the Start campaign, which includes an “Algorithm of Disrespect” interactive depicting what a young man may encounter on social media.

The federal government has also funded a $3.5m three-year trial to counteract the harmful impacts of social media messaging targeting young men and boys.

The social services minister, Amanda Rishworth, says combatting misogynistic attitudes and behaviour in the online and offline world will help achieve the national plan to end violence against women and children in one generation.

“Around 25% of teenage boys in Australia look up to social media personalities who perpetuate harmful gender stereotypes and condone violence against women - this is shocking,” she says.


The original article contains 1,154 words, the summary contains 170 words. Saved 85%. I'm a bot and I'm open source!
