this post was submitted on 13 Oct 2024
69 points (98.6% liked)

Fediverse

 

What are the risks associated with this? With image-uploading capabilities and the like, I'm thinking there might be an issue with people posting highly illegal content. I used to run some smaller forums 15 years ago and that went fine, but it feels like the risks are higher today... I'm thinking both about one's own mental health in having to moderate such content, and about whether running an instance becomes a legal liability if people post illegal content.

[–] [email protected] 21 points 2 weeks ago (4 children)

I never thought I'd be a registered CSAM reporter with the feds, but then I decided to host public content via Lemmy. Turns out that while 99.9% of users are great or fine, that 0.1% are just assholes for the sake of being assholes.

[–] [email protected] 9 points 2 weeks ago (1 children)

I think Lemmy/Mbin would benefit from 'moderation pools'. The basic idea is that, if you subscribe to or join a moderation pool, your instance will automatically copy any moderation action taken on content your instance also hosts. This would allow multiple single-admin instances to moderate even during off-hours of any single admin.
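A rough sketch of how a pool subscription like that might behave. Nothing like this exists in Lemmy/Mbin today, and every name and the action format below are invented for illustration:

```python
# Hypothetical sketch of the "moderation pool" idea: every instance in a
# pool mirrors removal actions taken by any other member, but only for
# content it also hosts. All identifiers here are made up.

hosted_posts = {"post:123", "post:456", "post:789"}  # content this instance has
pool_members = {"lemmy.example", "mbin.example"}     # instances in our pool

removed_locally = set()

def on_pool_action(action: dict) -> None:
    """Apply a moderation action broadcast by another pool member."""
    if action["actor_instance"] not in pool_members:
        return  # ignore actions from instances outside the pool
    if action["type"] == "remove" and action["post_id"] in hosted_posts:
        removed_locally.add(action["post_id"])

# Example: an admin on lemmy.example removes a post during our off-hours,
# and our instance copies the removal automatically.
on_pool_action({"actor_instance": "lemmy.example",
                "type": "remove",
                "post_id": "post:456"})
```

The trust boundary is the pool membership list: you only mirror actions from admins you've explicitly opted in to following.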

[–] [email protected] 8 points 2 weeks ago (1 children)

That's partially what https://fediseer.com/ does.

The same dev also made a CSAM scanning tool based on AI image recognition.

[–] [email protected] 9 points 2 weeks ago (2 children)

Hmm, this is something I hadn't heard about. Can you actually register as an instance host with the FBI or an equivalent agency to say, "Hey, I run a service that may be exposed to CSAM; I don't condone it and will report any cases I see"? If so, that could ease a lot of people's specific legal fears about hosting.

[–] [email protected] 6 points 2 weeks ago (1 children)

Not with the FBI, but with the National Center for Missing and Exploited Children, who collate reports and work with the FBI. Cloudflare and others have services that route all images through their detection systems and will auto-block and report CSAM. I didn't want to use Cloudflare, but it turns out that if I did somehow accidentally host it, I would be charged with hosting it. I have to report it, or I'm the responsible party.

[–] [email protected] 1 points 2 weeks ago* (last edited 2 weeks ago)

That's good to know. I've had some half-baked plans to host a public instance for a while (I'll probably get to it in winter), and honestly the legal risk has really held me back. Knowing I have a way to cover my ass when removing it is great.

[–] [email protected] 5 points 2 weeks ago

Unfortunately this isn't applicable outside of the US in many cases, like in my case.

[–] [email protected] 8 points 2 weeks ago (1 children)

This is why I decided not to host an instance in the end. Where I live, the laws make the host responsible for the content on their servers. So if some shitbag posts CP that gets synced to my server and the authorities somehow find out, it would seriously fuck up my life.

[–] [email protected] 5 points 2 weeks ago

Not only do people avoid creating instances for this reason, but several previously existing instances shut down as a result, like DMV.social.

[–] [email protected] 2 points 2 weeks ago (1 children)

If you self-host a single-user instance, do you still need to register? I get registering if you host a multi-user instance.

[–] [email protected] 2 points 2 weeks ago

If it's open to the public, yes. Even if people don't have accounts, if they can still see the offending content, then yes.

However, I bet if you use nginx you could block public access and require an account: something like "if this isn't the login page and the request doesn't have a token, block it".
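A minimal sketch of that idea at the reverse-proxy layer. The cookie name (`jwt`), backend port (8536), hostname, and paths are all assumptions about a typical Lemmy setup, not a verified config:

```nginx
# Sketch only: deny anonymous visitors, allow the login flow and any
# request carrying a session cookie. Adjust cookie name, port, and
# paths for your actual deployment.
map $cookie_jwt $anonymous {
    default 0;   # cookie present: let the request through
    ""      1;   # no session cookie: treat as anonymous
}

server {
    listen 80;
    server_name lemmy.example.org;  # placeholder hostname

    # Let people reach the login page and login API so they can sign in
    location = /login {
        proxy_pass http://127.0.0.1:8536;
    }
    location = /api/v3/user/login {
        proxy_pass http://127.0.0.1:8536;
    }

    # Everything else requires the session cookie
    location / {
        if ($anonymous) {
            return 403;
        }
        proxy_pass http://127.0.0.1:8536;
    }
}
```

One caveat: a blanket rule like this also blocks signature-authenticated ActivityPub traffic from other instances, which carries no cookie, so a real setup would need to exempt the federation endpoints (e.g. the inbox routes) or federation breaks.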