this post was submitted on 28 Aug 2023
343 points (98.3% liked)

FediLore + Fedidrama

2341 readers
49 users here now

Rules

  1. Any drama must be posted as an observer, you cannot post drama that you are involved with.
  2. When posting screenshots of drama, you must obscure the identity of all the participants.

Chronicle the life and tale of the fediverse (+ matrix)

Largely a sublemmy about capturing drama, from fediverse-spanning drama to just Lemmy drama.

Includes lore like how an instance got its name, how an instance got defederated, how an admin got doxxed, fedihistory, etc.

(New) This sub is intended to be an archive/newspaper, so preferably don't get into fights with each other or with the people featured in the drama.

Tags: fediverse news, lemmy news, lemmyverse


founded 2 years ago

They also shut down registration

Whoever is spamming CP deserves the woodchipper

[–] [email protected] 146 points 1 year ago (12 children)

The fact that some of you are putting the blame on instance owners/moderators just shows that you have about the same amount of brain rot as the people actually posting this vile trash.

[–] [email protected] 28 points 1 year ago

Right. This is a community effort, and it's important we support our instances and figure out how to best keep them safe.

[–] [email protected] 53 points 1 year ago (1 children)

These comments so far stink; y'all are something else.

[–] [email protected] 95 points 1 year ago (19 children)

OK, I am going to take a minute away from the shit-stirring and, speaking as an admin who's had the misfortune of dealing with this, potentially provide some insight, so I can maybe shift this comment section into an actually meaningful discussion.

You can have your own opinions and feelings about lemmy.world, but this?

> The only thing that could have prevented this is better moderation tools. And while a lot of the instance admins have been asking for this, it doesn't seem to be on the developers' roadmap for the time being. There are just two full-time developers on this project and they seem to have other priorities. No offense to them, but it doesn't inspire much faith for the future of Lemmy.

This is correct. Most Lemmy admins likely agree as well; I don't speak for anyone but myself, but I think it would be hard to find someone who disagreed. What happened today is the result of a catastrophic failure on Lemmy's end, with issues that should have been addressed over a month ago being completely ignored. The Lemmy devs shared a roadmap during their AMA, and they were essentially more concerned with making shit go faster... that's about it.

[–] [email protected] 26 points 1 year ago (3 children)

Okay, honest question: what mod tools are lacking? If something is needed, what is that thing or things?

I went over to the feature request page for Lemmy and I couldn't find anything massive in terms of requests for moderation tools that would have been surefire ways to stop this particular event.

That said, there are over 400 open feature requests on Lemmy's GitHub alone. I obviously couldn't go through every single one. But coming from the kbin side, I'm just curious about our Lemmy brothers and sisters. It sounds dire, and I'm woefully underinformed on how bad it is.

[–] [email protected] 25 points 1 year ago

There aren't enough roles. There's admin, moderator, and user, but it would be best to have tiers of user in between. Reports go to four categories of user when you file one. Report a comment for violating a fun rule your community decided to implement (say, all post titles must contain "Jon Bois Rules!")? That report goes to: the community moderators (good), the community's host instance's admins (bad), your instance's admins (bad), and the admins of the instance of the user who posted the "offending" post (bad).
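To make that fan-out concrete, here is a minimal sketch of who currently receives such a report. This is purely an illustration of the routing described above, not Lemmy's actual code; the `Report` type and recipient strings are hypothetical:

```python
# Illustration only: models the report fan-out described above, not Lemmy's code.
from dataclasses import dataclass


@dataclass
class Report:
    reporter_instance: str    # instance of the user filing the report
    community_instance: str   # instance hosting the community
    author_instance: str      # instance of the user who posted the content
    reason: str


def report_recipients(report: Report) -> list[str]:
    """Everyone who currently sees a community-level report."""
    return [
        "moderators of the community",             # the only recipients a fun-rule report needs
        f"admins of {report.community_instance}",  # the community's host instance
        f"admins of {report.reporter_instance}",   # the reporter's own instance
        f"admins of {report.author_instance}",     # the posting user's instance
    ]


if __name__ == "__main__":
    r = Report("lemmy.example", "feddit.example", "kbin.example",
               'title missing "Jon Bois Rules!"')
    for recipient in report_recipients(r):
        print(recipient)
```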

Only admins can permanently remove illegal content. If a mod "removes" it, it still sits visible to everyone in the modlog, and for CSAM specifically that counts as distribution, which is prosecuted as a worse crime than possession.

Federation with other instances is effectively binary: you can or cannot federate, and you cannot set traffic as unidirectional like you can on most other fediverse platforms.

The modlogs make it hard to parse who the moderator performing an action is acting on behalf of. Was it a community mod? An admin? Your admin?

There's more but my phone is getting low on battery

[–] [email protected] 16 points 1 year ago (6 children)

Agreed. I don't know what AutoMod did on Reddit, but if what mods need is a rule-configurable post remover, then I'd be happy to cobble together something in Python.
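For the curious, here is a minimal sketch of what such a rule-configurable remover could look like. It assumes a 0.18-era Lemmy HTTP API with `/api/v3/post/list` and `/api/v3/post/remove` and token-in-body auth (check your instance's API docs before relying on this); the instance URL, token, community name, and rules are placeholders:

```python
# Sketch of a tiny AutoMod-style bot: poll new posts, remove ones matching rules.
# Endpoint paths and body-based auth follow 0.18-era Lemmy; verify against your
# instance's API documentation. All credentials and patterns below are placeholders.
import re
import time

import requests

API = "https://lemmy.example/api/v3"   # hypothetical instance
JWT = "moderator-jwt-goes-here"        # token for a moderator account
RULES = [
    re.compile(r"free\s+crypto", re.I),  # example spam pattern a community might configure
    re.compile(r"\bt\.me/\S+", re.I),    # example: Telegram invite links
]


def fetch_new_posts(community: str) -> list[dict]:
    """Pull the newest posts in a community."""
    resp = requests.get(
        f"{API}/post/list",
        params={"community_name": community, "sort": "New", "limit": 20, "auth": JWT},
    )
    resp.raise_for_status()
    return [view["post"] for view in resp.json()["posts"]]


def violates(post: dict) -> bool:
    """Check the title (and body, if any) against every configured rule."""
    text = f"{post.get('name', '')}\n{post.get('body') or ''}"
    return any(rule.search(text) for rule in RULES)


def remove_post(post_id: int, reason: str) -> None:
    """Remove a post as a moderator, with a reason that lands in the modlog."""
    resp = requests.post(
        f"{API}/post/remove",
        json={"post_id": post_id, "removed": True, "reason": reason, "auth": JWT},
    )
    resp.raise_for_status()


if __name__ == "__main__":
    while True:
        for post in fetch_new_posts("example_community"):
            if violates(post):
                remove_post(post["id"], "AutoMod: matched a configured rule")
        time.sleep(60)
```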

[–] [email protected] 13 points 1 year ago (1 children)

There's this bot that is used in a couple of communities on feddit.de:

https://github.com/Dakkaron/SquareModBot

[–] [email protected] 12 points 1 year ago

Here are some of the moderation features Beehaw admins have been asking for since June: https://beehaw.org/comment/397674

See GitHub issues #3255 and #3275.

[–] [email protected] 12 points 1 year ago (2 children)

As an admin, how do kbin moderation tools compare?

Also, does lemmy.world have the spare cash to offer bounties for features?

[–] [email protected] 27 points 1 year ago

Kbin moderation tools are worse. And potentially. I guess a bug bounty could be started up.

[–] [email protected] 11 points 1 year ago

I don't know this for sure, but I have a feeling that a hard fork is in Lemmy's future. I don't want to get super into it, but programming is a form of communication: the features you bake into a platform reflect the messages you want to propagate on that platform. The Lemmy devs' vision for what the platform should be might not match what most of us think it should be, and the moderation tools might not be a focus for a while, even if most of us view them as the greatest need.

[–] [email protected] 50 points 1 year ago (2 children)

Looks like some CSAM fuzzy hashing would go a long way toward catching someone trying to submit that kind of content, if each uploaded image is scanned.

https://blog.cloudflare.com/the-csam-scanning-tool/

Not saying to go with CloudFlare (just showing how the detection works overall), but Lemmy could use some kind of built-in detection system that grabs an updated hash table periodically.
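As a rough illustration of the approach (not CloudFlare's tool): perceptually hash each upload and compare it against a periodically refreshed blocklist of known-bad hashes. This sketch uses the third-party Pillow and imagehash packages; the example hash and distance threshold are placeholders:

```python
# Sketch of perceptual-hash matching against a blocklist of known-bad images.
import imagehash
from PIL import Image

# Placeholder: in practice this set would be fetched on a schedule from a
# trusted hash provider (access to real CSAM hash lists is gated).
KNOWN_BAD_HASHES = {imagehash.hex_to_hash("c3d4e5f6a7b8c9d0")}
MAX_DISTANCE = 6  # Hamming-distance threshold: lower means fewer false positives


def is_known_bad(image_path: str) -> bool:
    """Return True if the upload is a near-match for any blocklisted hash."""
    upload_hash = imagehash.phash(Image.open(image_path))
    return any(upload_hash - bad <= MAX_DISTANCE for bad in KNOWN_BAD_HASHES)
```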

[–] [email protected] 25 points 1 year ago (1 children)

Not a bad idea. I was working on a project once that would support user-uploaded images and looked into PhotoDNA, but it was an incredible pain in the ass to get access to. I'm surprised that someone hasn't realized that this should just be free and available. Kind of gross that it is put behind an application/paywall, imo. They're just hashes and a library to generate the hashes; why shouldn't that just be open source and available through the NCMEC?

[–] [email protected] 38 points 1 year ago (1 children)

Is there not some way to involve the authorities? I feel like the FBI/CIA or foreign agencies would love to track down whoever is distributing this. Like setting up some sort of honeypot instance to catch them.

[–] [email protected] 12 points 1 year ago (1 children)

They probably connect using Tor. Not much you can do with that information (without effort far exceeding the value of one CP spammer).

[–] [email protected] 11 points 1 year ago (5 children)

Doesn't the NSA run half of all Tor exit nodes?

[–] [email protected] 33 points 1 year ago (1 children)

I'm a bit confused: how does locking down a single community help?

Are the spammers really just focusing on one community instead of switching to the next after it gets banned?

I do hope there is an IP ban option, so someone can't just use the same IP to create an account on another instance and post CSAM from there. Obviously I know about VPNs, but it makes it a tiny bit more difficult to spam in large amounts.

[–] [email protected] 11 points 1 year ago (9 children)

Most people don't have static IP addresses, so banning their IP will only stop them temporarily, and whoever gets that dynamic IP address next will be banned too. Then there's CGNAT, where one IP address can have up to 128 people using it at once and the address changes even more frequently.

[–] [email protected] 19 points 1 year ago (3 children)

Is it that hard for these people to not be completely retarded and inappropriate on the internet? Lemmy is the only "viable" alternative to Reddit and they have to fuck it up.

[–] [email protected] 30 points 1 year ago

I'd assume that fucking it up is the goal. Some people are just irredeemable sociopaths who get satisfaction out of ruining other people's days.

[–] [email protected] 15 points 1 year ago (1 children)

While I understand the move entirely, I can't help but wonder if that might have been the intent of the perpetrators.

[–] [email protected] 28 points 1 year ago (3 children)

Definitely was. It was just a flex of their power. I don't see any viable solution at the moment though, so going nuclear was the only sane option. When your options are closing a door versus playing an increasingly difficult game of cat and mouse with CP posters, most would opt to temporarily shutter their doors, I feel.

What is worrying is that any community on any Lemmy instance is vulnerable to this type of attack. This will keep happening again and again until a clear solution, technical or otherwise, can be devised.

I gave my loyalty to Lemmy. I am not going to jump ship because some deranged lunatics decide to troll in the most abhorrent ways. I plan on donating to the project in a show of support, and I hope others do as well.

[–] [email protected] 9 points 1 year ago (3 children)

Honestly, I think it was destined to happen one way or another because of an open-signups server getting so big. The burggit/vlemmy debacle was the warning shot.

It should jump-start overdue efforts to improve moderation granularity and make it easier for mods to manage users and content.
