this post was submitted on 28 Aug 2023
1746 points (97.9% liked)

Lemmy.World Announcements

28381 readers
1 users here now

This Community is intended for posts about the Lemmy.world server by the admins.

Follow us for server news 🐘

Outages πŸ”₯

https://status.lemmy.world

For support with issues at Lemmy.world, go to the Lemmy.world Support community.

Support e-mail

Any support requests are best sent to [email protected] e-mail.

Report contact

Donations πŸ’—

If you would like to make a donation to support the cost of running this platform, please do so at the following donation URLs.

If you can, please use / switch to Ko-Fi, it has the lowest fees for us

Ko-Fi (Donate)

Bunq (Donate)

Open Collective backers and sponsors

Patreon

Join the team

founded 2 years ago
MODERATORS
 

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do: since we changed our registration policy, they simply post from another instance.

We keep working on a solution, we have a few things in the works but that won't help us now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @[email protected], the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it hadn't been his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what's next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than how we meant it. It's been a long day and having to deal with this kind of stuff got some of us a bit salty to say the least. Remember we also had to deal with people posting scat not too long ago so this isn't the first time we felt helpless. Anyway, I hope we can announce something more positive soon.

[–] [email protected] 471 points 1 year ago (8 children)
[–] [email protected] 300 points 1 year ago (3 children)

"Troll" is too mild a word for these people

[–] [email protected] 291 points 1 year ago (1 children)

How about "pedophile"? I mean, they had to have the images to post them.

[–] [email protected] 65 points 1 year ago (7 children)

"Terrorist." Having the images doesn't mean they liked them, but they did use them to terrorize a whole community.

[–] [email protected] 53 points 1 year ago (2 children)

Yeah, this isn’t just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don’t care if you sought out the CSAM, because it still exists on your device regardless of intent.

The laws about possessing CSAM are written in a way that any plausible deniability is removed, specifically to prevent pedophiles from being able to go β€œoh lol a buddy sent that to me as a joke” and getting acquitted. The courts don’t care why you have CSAM on your server. All they care about is the fact that you do. And since you own the server, you own the CSAM and they’ll prosecute you for it.

[–] [email protected] 133 points 1 year ago (41 children)

That's not a troll; CSAM goes well beyond trolling. "Pedophile" would be a more accurate term for them.

[–] [email protected] 332 points 1 year ago (15 children)

I would like to extend my sincerest apologies to all of the users here who liked Lemmy Shitpost. I feel like I let the situation grow too far out of control before getting help. Don't worry, I am not quitting; I fully intend to stay around. The other two deserted the community, but I won't. DM me if you wish to apply for mod.

Sincerest thanks to the admin team for dealing with this situation. I wish I had linked in with you all earlier.

[–] [email protected] 208 points 1 year ago (1 children)

@[email protected] this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this cannot be stopped. Lemmy needs better moderation tools.

[–] [email protected] 52 points 1 year ago (4 children)

Hopefully the devs will take the lesson from this incident and put some better tools together.

[–] [email protected] 49 points 1 year ago

There's a Matrix room for building mod tools here; maybe we should bring up this issue there, just in case they aren't already aware.

[–] [email protected] 73 points 1 year ago

Please, please, please do not blame yourself for this. This is not your fault. You did what you were supposed to do as a mod and stepped up and asked for help when you needed to; Lemmy just needs better tools. Please take care of yourself.

[–] aport 267 points 1 year ago (2 children)
[–] [email protected] 193 points 1 year ago (1 children)

This isn't as crazy as it may sound either. I saw a similar situation, contacted them with the information I had, and the field agent was super nice/helpful and followed up multiple times with calls/updates.

[–] [email protected] 95 points 1 year ago (1 children)

This doesn't sound crazy in the least. It sounds like exactly what should be done.

[–] [email protected] 45 points 1 year ago (2 children)

Yeah, what do people think the FBI is for? This isn't crazy. They can get access to ISP logs, VPN provider logs, etc.

[–] [email protected] 115 points 1 year ago (15 children)

This is good advice; I suspect they're outside of the FBI's jurisdiction, but they could also be random idiots, in which case they're random idiots who are about to become registered sex offenders.

[–] [email protected] 45 points 1 year ago (2 children)

They might be, but I'd imagine most countries have laws on the books about this sort of stuff too.

[–] [email protected] 169 points 1 year ago (3 children)

It is seriously sad and awful that people would go this far to derail a community, and it makes me concerned for other communities as well. Since they have succeeded in having Lemmyshitpost closed, does this mean they will just move on to the next community? That being said, here is some very useful information on the subject and on what can be done to help curb CSAM.

The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.

The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.

The National Child Abuse Hotline: If you suspect child abuse, you can call 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.

Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.

Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.

Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.

Here are some tips to prevent CSAM:

Talk to your children about online safety and the dangers of CSAM.

Teach your children about the importance of keeping their personal information private.

Monitor your children's online activity.

Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.

Report any suspected CSAM to the authorities immediately.

[–] [email protected] 149 points 1 year ago (36 children)

Not that I'm familiar with Rust at all, but... perhaps we need to talk about this.

The only thing that could have prevented this is better moderation tools. And while a lot of the instance admins have been asking for this, it doesn’t seem to be on the developers roadmap for the time being. There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn’t inspire much faith for the future of Lemmy.

Let's be productive. What exactly are the moderation features needed, and what would be easiest to implement in the Lemmy source code? Are you talking about a mass ban of users from specific instances? A ban of new accounts from certain instances? What moderation tool, exactly, is needed here?

[–] [email protected] 116 points 1 year ago (8 children)

Speculating:

Restricting posting from accounts that don't meet some adjustable criteria, like account age, comment count, prior moderation action, or average comment length (maybe not an upvote quota, because not all instances use votes).

Automatic hash comparison of uploaded images against a database of registered illegal content.
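The first idea, an adjustable posting gate, could be sketched like this (all field names and thresholds here are hypothetical, not anything Lemmy actually exposes):

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int       # days since registration
    comment_count: int  # comments made so far
    mod_actions: int    # prior moderation actions against the account

def may_post(acct: Account, min_age_days: int = 7,
             min_comments: int = 10, max_mod_actions: int = 0) -> bool:
    """Return True if the account meets the instance's adjustable criteria."""
    return (acct.age_days >= min_age_days
            and acct.comment_count >= min_comments
            and acct.mod_actions <= max_mod_actions)
```

The point of making the thresholds parameters is that each instance could tune them to its own risk tolerance.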

[–] [email protected] 65 points 1 year ago

On various old-school forums, there's a simple (and automated) system of trust that progresses from new users (who might be spammers), where every new post might need a manual "approve" before it shows up (this existed on Reddit in some communities too), to full posting powers granted eventually (or, in the case of Stack Overflow, automated access to the moderation queue).
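That kind of trust ladder could be sketched like this (the level names and thresholds are made up for illustration):

```python
def trust_level(posts_approved: int) -> str:
    """Very rough old-school-forum trust ladder; thresholds are hypothetical."""
    if posts_approved < 5:
        return "new"      # every post waits in the approval queue
    if posts_approved < 50:
        return "member"   # posts appear immediately
    return "trusted"      # may help work the moderation queue
```

Because promotion is driven purely by a counter of already-approved posts, a throwaway account never gets past the queue without sustained legitimate activity.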

[–] [email protected] 145 points 1 year ago (4 children)

The number of people in these comments asking the mods not to cave is bonkers.

This isn’t Reddit. These are hobbyists without legal teams to a) fend off false allegations or b) comply with laws that they don’t have any deep understanding of.

[–] [email protected] 141 points 1 year ago (5 children)

This is flat-out disgusting. It's extremely disturbing that someone has an arsenal of this crap to spread in the first place. I hope they catch charges.

[–] [email protected] 118 points 1 year ago (38 children)

There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn’t inspire much faith for the future of Lemmy.

This doesn't seem like a respectful comment to make. People have responsibilities; they aren't paid for this. It doesn't seem fair to criticize something when we aren't doing anything to provide a solution. A better comment would be: "There are just two full-time developers on this project and they have other priorities. We are working on increasing the number of full-time developers."

[–] [email protected] 92 points 1 year ago (12 children)

Imagine if you were the owner of a really large computer with CSAM on it, there were in fact no good way to prevent creeps from putting more onto it, and when the police came to have a look at it you were liable for the legal fallout. Now imagine you had dependents. You would also be well past the point of being respectful.

On that note, the captain db0 has raised an issue on the GitHub repository of LemmyNet, requesting essentially the ability to add middleware that checks the nature of uploaded images (issue #3920, if anyone wants to check). Point being, the ball is squarely in their court now.
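A minimal sketch of what such upload-checking middleware boils down to (the blocklist here is an empty placeholder; real deployments use perceptual hashing, e.g. PhotoDNA-style vendor services, rather than plain SHA-256, because any re-encode or crop changes a cryptographic hash):

```python
import hashlib

# Hypothetical set of hex digests of known-bad files, e.g. synced from a
# reporting body's hash list. Plain SHA-256 only catches byte-identical
# re-uploads, which is why production systems use perceptual hashes instead.
BLOCKED_DIGESTS: set[str] = set()

def reject_upload(data: bytes) -> bool:
    """Return True if the uploaded bytes match a known-bad digest."""
    return hashlib.sha256(data).hexdigest() in BLOCKED_DIGESTS
```

Hooking a check like this in before the media store ever writes the file is the key design point: the instance never hosts the content, even transiently, and moderators never have to see it.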

[–] [email protected] 53 points 1 year ago

I agree with you, I'd just gently suggest that it's borne of what is probably significant upset at having to deal with what they're having to deal with.

[–] [email protected] 110 points 1 year ago (2 children)

Fucking bastards. I don't even know what beef they have with the community and why, but using THAT method to get them to shut down is nothing short of despicable. What absolute scum.

[–] [email protected] 109 points 1 year ago (76 children)

I hope the devs take this seriously as an existential threat to the fediverse. Lemmyshitpost was one of the largest communities on the network in both AUPH and subscribers. If taking the community down is the only option here, that's extremely insufficient, and it spells death for the platform at the hands of uncontrolled spam.

[–] [email protected] 97 points 1 year ago

Please get some legal advice, this is so fucked up.

[–] [email protected] 82 points 1 year ago (1 children)

Genuine question: won't they just move to spamming CSAM in other communities?

[–] [email protected] 79 points 1 year ago (9 children)

We have been fighting the CSAM (Child Sexual Abuse Material) posts all day but there is nothing we can do because they will just post from another instance since we changed our registration policy.

It's likely that we'll see a large number of instances switch to whitelist-based federation instead of the current blacklist-based approach, especially niche instances that do not want to deal with this at all (and I don't blame them).
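The difference between the two federation modes comes down to a one-line check; a sketch with hypothetical instance lists:

```python
# Blacklist federation: accept everyone except explicitly blocked instances.
# Whitelist federation: accept only explicitly allowed instances.
ALLOWED = {"lemmy.world", "lemmy.ml"}   # hypothetical allowlist
BLOCKED = {"bad.example"}               # hypothetical blocklist

def federates_with(instance: str, allowlist_mode: bool) -> bool:
    """Decide whether to accept activity from a remote instance."""
    if allowlist_mode:
        return instance in ALLOWED
    return instance not in BLOCKED
```

The trade-off is visible in the default: under a blacklist a brand-new malicious instance is accepted until someone notices it, while under a whitelist it is rejected until an admin vouches for it.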

[–] [email protected] 77 points 1 year ago (3 children)

Sounds like the 4chan raids of old.

Batten down, report the offenders to the authorities, and then clean up the mess!

Good job so far ^_^

[–] [email protected] 70 points 1 year ago* (last edited 1 year ago) (17 children)

How does closing lemmyshitpost do anything to solve the issue? Isn't it a foregone conclusion that the offenders would just start targeting other communities or was there something unique about lemmyshitpost that made it more susceptible?

[–] [email protected] 64 points 1 year ago (1 children)

Is it possible to (at least temporarily):

  1. Turn off instance image hosting (disable pictrs)
  2. Disallow image and video posts across all communities
  3. As in Firefish, turn off caching of remote images from other instances.

whilst longer-term solutions are sought? This would at least ensure poor mods aren't exposed to this shit, and an instance could be more certain it isn't inadvertently hosting CSAM.

[–] [email protected] 63 points 1 year ago

Good thing you did it the way you did; nobody should have to look at awful stuff like this. Keep your mind healthy; nobody should have to deal with that.

[–] [email protected] 62 points 1 year ago (1 children)

Thank you so much for all of the effort and time all of you are putting into this situation. Having to deal with bad actors is one thing, but you are now dealing with images that are traumatizing to view.

Please, for your sanity and overall well-being, PLEASE take care of yourselves. Yes, it sucks having to close !lemmyshitpost, but self-care and support are of the utmost importance.

[–] [email protected] 60 points 1 year ago* (last edited 1 year ago) (17 children)

I'm afraid the fediverse will need a crowdsec-like decentralized banning platform: get banned on one platform for this shit, get banned everywhere.

I'm willing to participate in fleshing that out.

Edit: it's just an idea, I do not have all the answers, otherwise I'd be building it.
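A minimal sketch of the consuming side of such a system (everything here is hypothetical; no such fediverse service exists): instances would publish the bans they issue for this category as feeds, and subscribers would merge them into their local ban set.

```python
def merge_remote_bans(local_bans: set[str],
                      remote_feeds: list[set[str]]) -> set[str]:
    """Union local bans with ban feeds fetched from trusted peer instances.

    Each entry is a user handle like "[email protected]". Returns a new set,
    leaving the local set untouched so a bad feed can be dropped and re-merged.
    """
    merged = set(local_bans)
    for feed in remote_feeds:
        merged |= feed
    return merged
```

The hard problems are trust and revocation (who may publish a feed, and how a wrongful ban gets undone everywhere), which is exactly what a crowdsec-style reputation layer would have to solve.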

[–] [email protected] 59 points 1 year ago (27 children)

I assume you've contacted the FBI, but if not PLEASE DO.

[–] [email protected] 54 points 1 year ago (3 children)

lemmy.world is based in Finland.

[–] [email protected] 91 points 1 year ago (3 children)

Yes, the Finnish Bureau of Investigation

[–] [email protected] 57 points 1 year ago* (last edited 1 year ago)

Thank you for your work to keep that despicable trash out of our feeds. Sorry you have to deal with it. Fuck those losers.

[–] [email protected] 53 points 1 year ago

Thank you for all your work. It sucks that there are people who would do shit like this. Please don't forget to take care of yourselves as well.

[–] [email protected] 48 points 1 year ago (4 children)

Looks like Google has some tooling available that might help: https://protectingchildren.google/tools-for-partners

Probably other options too.

[–] [email protected] 48 points 1 year ago (19 children)

Who the fuck has such a problem with this instance?
