this post was submitted on 11 Dec 2023
439 points (93.0% liked)

Technology
Politically-engaged Redditors tend to be more toxic -- even in non-political subreddits: A new study links partisan activity on the Internet to widespread online toxicity, revealing that politically-engaged users exhibit uncivil behavior even in non-political discussions. The findings are based on an analysis of hundreds of millions of comments from over 6.3 million Reddit users.

[–] [email protected] 88 points 9 months ago* (last edited 9 months ago) (5 children)

Political topics are also the topics most heavily gamed by political actors using Persona Management software to make it seem like their opinion is the majority one. The idea that people who participate in things such as "forum sliding" aren't toxic in their interactions is absurd, so we're left assuming a large number of these toxic accounts aren't actually real people.

I'm not saying people deep into politics can't be toxic. Plenty of them are, sure. However, it's in the interest of people with political power (especially politicians with politically unpopular ideas) to make regular people not want to participate in politics. One way you do that is to make all political people seem unhinged, angry, and just terrible. People wonder why hardly anyone votes in elections; this kind of stuff is why, and it's no accident that these folks seem like the majority.

I'm fully convinced the majority of them are bots trying to make politics in general seem more toxic than it actually is to dissuade more people from even wanting to be involved. The intent is to drive political apathy.


Sources:

US government developing Persona Management software in 2011: https://www.theguardian.com/technology/2011/mar/17/us-spy-operation-social-networks

Eglin Air Force Base named the most "reddit-addicted city" in 2013: https://web.archive.org/web/20160604042751/http://www.redditblog.com/2013/05/get-ready-for-global-reddit-meetup-day.html

One of many research papers on Persona Management and Influencing Social Networks from Eglin AFB: https://arxiv.org/pdf/1402.5644.pdf


Helpful Reading Materials:

The Gentleperson's Guide To Forum Spies: https://cryptome.org/2012/07/gent-forum-spies.htm

[–] [email protected] 25 points 9 months ago (2 children)

100% agree with you. The worst part is that the bots are getting better and better. My policy is to respond once to clarify and then walk away. That works for obvious bad actors, but now they seem more and more like decent people with a flawed idea, until you keep talking and realize it's a bot. I don't know how to counteract that.

[–] [email protected] 38 points 9 months ago (1 children)

"I don't know how to counteract that."

Simple. You don't. When I'm debating, I'm usually not trying to convince the person I'm debating with. I'm trying to convince a disinterested third party who reads the exchange later.

[–] [email protected] 11 points 9 months ago

I completely agree that it's for the people who read it later; it just feels like a waste of time for me once it's become a lengthy thread that nobody is going to read anyway.

The other thing they do is a bot attack where they take what people are saying, alter it slightly, and post a flood of those variations to bury comments they don't want others to see. Not sure how to counteract that either.

[–] [email protected] 10 points 9 months ago (2 children)

How do you know they're actually bots? 90% of the time, when I'm debating someone who is passionately defending their position, they'll at some point accuse me of being a bot or a shill. I also can't recall a single debate where I came away convinced the other person was a bot.

I'm just skeptical as it's a convenient ad hominem.

[–] [email protected] 6 points 9 months ago

To be totally honest with you, I wouldn't for one second be surprised if the bots are programmed to accuse humans of being bots.

[–] [email protected] 4 points 9 months ago (1 children)

I have the same question. How do you distinguish an advanced enough bot from a genuinely dumb person?

[–] [email protected] 5 points 9 months ago

Or "smart" person. There are almost certainly bots who espouse beliefs that align with yours too.

[–] [email protected] 10 points 9 months ago (1 children)

Up until a few weeks ago, it seemed these bots were mostly absent on Lemmy.

But recently, I have noticed they have arrived here, too.

I fully agree with your analysis.

[–] [email protected] 14 points 9 months ago (1 children)

In what way? Lemmy has been very political from the start. It arguably got less so after the influx of redditors.

What are you seeing in the last month or so that makes you think there's something more abnormal happening than usual?

[–] thepiggz 6 points 9 months ago (1 children)

Intriguing. I don’t totally know what I think about this argument. A purposeful initiative to make politics toxic to get people to stop paying attention. It’s not one I had totally considered before. You think that’s really going on?

I have had many experiences with real people not on the internet that seem to fixate largely on politics and believe so fervently that they are right that they allow themselves to become toxic. I always thought it was a kind of inconsistent latent belief in utilitarianism combined with overconfidence.

[–] [email protected] 10 points 9 months ago* (last edited 9 months ago) (1 children)

I'm not saying those people don't really exist; there are tons of them out there, for sure. But we also have extensive evidence of governments doing this.

GCHQ in the UK ran JTRIG, which used many forum disruption techniques.

There's also the Five Eyes and how they use information sharing to essentially do an end-run around restrictions on spying on their own citizens. Technically they're not spying on their own citizens; a foreign nation is. They just happen to have an agreement with that foreign nation to get the info on their own citizens.

The US definitely engages in this kind of thing against foreign nations as well. It tried to create a social media service for Cuba to influence Cuban politics and gather information.

Does either the UK or the US have to spy on its own citizens if they can rely on each other to run influence campaigns in each other's countries? The US had to apologize to Angela Merkel for tapping her phone.

Israel has many different programs aimed at managing the PR of the state of Israel online, from paying college students to speak positively of Israel to having "think tanks" use teams of people to influence Wikipedia.

We know that Hacking Team was selling its surveillance software to oppressive regimes, which were definitely using it to oppress their populations. If they're using those kinds of tools, they're using online disinformation tools as well.

So once again, there are tons of real-life absolute maniacs when it comes to politics. There's also an incentive for governments around the world to run influence campaigns for pennies on the dollar with digital tools in the digital world.

[–] thepiggz 4 points 9 months ago

I think you’re right that there are people out there trying to manipulate and influence social media; even the platforms themselves do this to a certain extent.

The idea that they purposely try to make it toxic to push the more intellectually-honest, emotionally-controlled people out of the conversation is the interesting part to me.

This particular facet feels less like intentional manipulation and more like a side-effect of our platforms and how they function.

[–] [email protected] 3 points 9 months ago

Found a reddit mod with a dozen-plus accounts. He made a new account to disagree with me; I pointed it out and he denied it, but he never used that account again.

It was probably just someone with no life, but I'd feel better about the world if he were being paid for it.

[–] [email protected] 0 points 9 months ago* (last edited 9 months ago) (1 children)

Are there any sources on this from the last decade?

Because I'm not sure if you noticed, but 2016 was kind of a big moment for politics, and it triggered a lot of anger and controversy. Politics on social media is a very different thing now than it was in 2011/2012. Which is to say nothing of the well-documented uptick in foreign troll farms and manipulative content sorting, which may have been present in the early 2010s, but nowhere near the degree it reached in the latter half of the decade and remains at today.

It's also worth pointing out this uptick in "political toxicity" is mirrored in real life. You can't blame the protests and increasingly violent altercations in real life on some psyops trying to make people not engage in politics.

And frankly... if the goal is to get people turned off from voting, they're failing. Turnout has been going up.

[–] [email protected] 0 points 9 months ago

Why, what happened in 2016? Did 46% of registered voters lose their goddamned minds and vote to put an entirely incompetent and demented convicted fraud and rapist sociopath who wears clown makeup in charge of the federal government or something? Why would that increase the fervor of fucking social fucking media for fuck’s sake jesus goddamned christ on a busted motherfucking crutch!!!

Sorry. You were saying?