this post was submitted on 30 Aug 2023
374 points (92.3% liked)

Selfhosted


EDIT

TO EVERYONE ASKING TO OPEN AN ISSUE ON GITHUB, IT HAS BEEN OPEN SINCE JULY 6: https://github.com/LemmyNet/lemmy/issues/3504

June 24 - https://github.com/LemmyNet/lemmy/issues/3236

TO EVERYONE SAYING THAT THIS IS NOT A CONCERN: Every country has different laws (in other words, not everyone is American), and whether or not an admin is liable for such content residing on their servers without their knowledge, don't you think it's still an issue? Are you not bothered that somebody could be sharing illegal images from your server without you ever knowing? Is that okay with you? Or are you only saying this because you're NOT an admin? Several admins have already responded in the comments and suggested ways to solve the problem, because they are genuinely as concerned about it as I am. Thank you to all the hard-working admins. I appreciate and love you all.


ORIGINAL POST

cross-posted from: https://lemmy.ca/post/4273025

You can upload images to a Lemmy instance without anyone knowing that the image is there if the admins are not regularly checking their pictrs database.

To do this, you create a post on any Lemmy instance, upload an image, and never click the "Create" button. The post is never created but the image is uploaded. Because the post isn't created, nobody knows that the image is uploaded.

You can also go to any post, upload a picture in the comment, copy the URL and never post the comment. You can also upload an image as your avatar or banner and just close the tab. The image will still reside in the server.

You can (possibly) do the same with community icons and banners.
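The mechanism above can be sketched as a plain HTTP request. This is a hypothetical illustration, not a working exploit: the instance URL is made up, and the endpoint path and cookie auth reflect how Lemmy's web UI typically proxies uploads to pict-rs, which you should verify against your own instance. The point it shows is that the upload request contains no post or comment ID at all.

```python
import urllib.request

INSTANCE = "https://lemmy.example"  # hypothetical instance URL

def build_upload_request(jwt: str, image_bytes: bytes, filename: str):
    """Build the multipart POST the web UI sends when you attach an
    image. Nothing here references a post or comment ID, which is why
    the server can't tie the upload to any content."""
    boundary = "----sketch-boundary"
    part_header = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="images[]"; '
        f'filename="{filename}"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    )
    body = part_header.encode() + image_bytes + f"\r\n--{boundary}--\r\n".encode()
    return urllib.request.Request(
        f"{INSTANCE}/pictrs/image",
        data=body,
        headers={
            "Content-Type": f"multipart/form-data; boundary={boundary}",
            "Cookie": f"jwt={jwt}",  # Lemmy's auth cookie (assumption)
        },
        method="POST",
    )

# Sending this request would store the file and return its URL -- and
# the image now sits on the server, attached to nothing.
```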

Why does this matter?

Because anyone can upload illegal images without the admin knowing and the admin will be liable for it. With everything that has been going on lately, I wanted to remind all of you about this. Don't think that disabling cache is enough. Bad actors can secretly stash illegal images on your Lemmy instance if you aren't checking!

These bad actors can then share these links around and you would never know! They can report it to the FBI, and if you haven't taken it down (because you didn't know) within a certain period, say goodbye to your instance and see you in court.

Only your backend admins who have access to the database (or object storage or whatever) can check this, meaning non-backend admins and moderators WILL NOT BE ABLE TO MONITOR THESE, and regular users WILL NOT BE ABLE TO REPORT THESE.

Aren't these images deleted if they aren't used for the post/comment/banner/avatar/icon?

NOPE! The image stays uploaded! Lemmy doesn't check whether images are actually used! Try it out yourself. Just make sure to grab the URL first, either by copying the link text or by clicking the image and choosing "copy image link".

How come this hasn't been addressed before?

I don't know. I am fairly certain that this has been brought up before. Nobody paid attention but I'm bringing it up again after all the shit that happened in the past week. I can't even find it on the GitHub issue tracker.

I'm an instance administrator, what the fuck do I do?

Check your pictrs images (good luck) or nuke it. Disable pictrs, restrict sign ups, or watch your database like a hawk. You can also delete your instance.
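"Check your pictrs images" boils down to diffing the files in pict-rs storage against every pictrs URL your Lemmy database references. A minimal sketch of that idea follows; the URL shape and helper names are assumptions for illustration, not a drop-in tool, and you would still have to pull the text columns (post bodies, comment bodies, avatars, banners, icons) out of Postgres yourself.

```python
import re
from typing import Iterable, Set

# pict-rs files are served under /pictrs/image/<name>.<ext>;
# the character class is an assumption -- adjust for your instance.
PICTRS_URL_RE = re.compile(r"/pictrs/image/([A-Za-z0-9-]+\.\w+)")

def referenced_files(texts: Iterable[str]) -> Set[str]:
    """Collect pictrs filenames referenced anywhere in the given text
    fields (post/comment bodies, avatar and banner URLs, icons...)."""
    found: Set[str] = set()
    for text in texts:
        found.update(PICTRS_URL_RE.findall(text or ""))
    return found

def orphaned_files(stored: Iterable[str], texts: Iterable[str]) -> Set[str]:
    """Files present in pict-rs storage but referenced by nothing --
    these are the stealth uploads the post is warning about."""
    return set(stored) - referenced_files(texts)
```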

Good luck.

top 50 comments
[–] [email protected] 85 points 1 year ago (6 children)

seems like the solution to this should be to automatically remove images that haven't been posted, after like 3 minutes

[–] [email protected] 15 points 1 year ago (2 children)

Or make it something like 1 hour, and don't give the user the URL of the uploaded image until they actually post it; that way it couldn't be shared or reported in the meantime.
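The purge idea from these comments could be sketched as a periodic job. Everything here (names, data shapes, the one-hour TTL) is hypothetical; it only illustrates the selection logic such a job would need.

```python
from datetime import datetime, timedelta
from typing import Dict, Set

def uploads_to_purge(
    uploads: Dict[str, datetime],   # filename -> time of upload
    referenced: Set[str],           # filenames some post/comment uses
    now: datetime,
    ttl: timedelta = timedelta(hours=1),
) -> Set[str]:
    """Files older than the TTL that nothing references -- candidates
    for automatic deletion under the commenters' proposal."""
    return {
        name for name, uploaded_at in uploads.items()
        if now - uploaded_at > ttl and name not in referenced
    }
```

A cron job or background task would run this on some interval and delete the returned files from pict-rs.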

[–] [email protected] 7 points 1 year ago

This is one way to solve it.

[–] [email protected] 49 points 1 year ago (3 children)

I'm usually pretty relaxed when it comes to disclosure of vulnerabilities, but this is the kind of issue where I think it would have been better to privately report it to the Lemmy devs and wait (probably a long time) for it to be fixed before disclosing.

Especially since there are currently multiple people abusing the image hosting feature.

Not a big deal, but sometimes it is actually a better practice to give an opportunity to the dev to fix something before forcing them to do so in a hurry.

[–] [email protected] 28 points 1 year ago (4 children)

I've mentioned this before in a similar reply, but I'll say it again: this was already publicly known months ago. People just forgot about it because they didn't think it was a big deal. Now that they realize CSAM is a real issue, I made this post to remind everyone about it. Bad actors already know about this, and really, it isn't hard to figure out how this works.

[–] [email protected] 9 points 1 year ago

Eh... Better make it public so you don't have people taking a chance with hosting CSAM!

[–] [email protected] 48 points 1 year ago (2 children)

This is not unique to Lemmy. You can do the same on Slack, Discord, Teams, GitHub, ... Finding unused resources isn't trivial, and you're usually better off ignoring the noise.

If you upload illegal content somewhere, and then tell the FBI about it, being the only person knowing the URL, let me know how that turns out.

[–] [email protected] 46 points 1 year ago* (last edited 1 year ago) (3 children)

Or just disable image uploads completely. We got by on Reddit without any built-in image hosting functionality for over a decade, so Lemmy should be fine without it as well - especially considering that we don't really have many image-heavy communities, besides the NSFW instances. I mean, storage costs money you know, and with Lemmy being run by volunteers, it makes even more sense to get rid of image hosting to save costs.

[–] [email protected] 8 points 1 year ago

I don't have the pictrs container running on my instance.

[–] [email protected] 44 points 1 year ago (7 children)

Note: my tool is currently the only solution that exists for this with regard to CSAM

https://github.com/db0/lemmy-safety

[–] [email protected] 10 points 1 year ago

Appreciate your work.

[–] [email protected] 10 points 1 year ago* (last edited 1 year ago) (1 children)

Not hosting images is a far better solution, and also exists.

[–] [email protected] 28 points 1 year ago (3 children)

This is how it works. Since pictrs and Lemmy are two completely different applications (they even run in two different containers with two different databases), they don't communicate, so tracking which images belong to which post or comment simply isn't possible in the current state, I guess.
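That separation shows up directly in a typical Lemmy deployment. The following is a simplified, illustrative docker-compose excerpt (image tags, paths, and settings are assumptions; check the official deployment files): pict-rs keeps its own storage that Lemmy's Postgres knows nothing about.

```yaml
services:
  lemmy:
    image: dessalines/lemmy
    # Lemmy proxies uploads through to pict-rs but keeps no
    # record of them in its own Postgres database
    environment:
      - LEMMY_CONFIG_LOCATION=/config/config.hjson
  pictrs:
    image: asonix/pictrs
    # pict-rs maintains its own store (embedded DB or object
    # storage), entirely separate from Lemmy's data
    volumes:
      - ./volumes/pictrs:/mnt
```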

How come this hasn’t been addressed before?

This is how the Fediverse works. There are so many bad practices, so much haphazardly implemented functionality, and so much bad API documentation all over the place that I wonder why nothing has blown up spectacularly so far. We don't even have proper data protection, and everything is replicated everywhere, causing a shitload of legal issues all over the world, but no-one seems to care so far.

[–] [email protected] 19 points 1 year ago

Sounds like the Internet Protocol I grew up with 😍

[–] [email protected] 7 points 1 year ago

This isn't unique to Lemmy or haphazard coding. It's a common technique for getting pictures into GitHub READMEs: you create a PR, upload an image, copy the link, delete the PR, and then paste the link elsewhere on GitHub.

[–] [email protected] 7 points 1 year ago

The difference between the Fediverse and a closed system like reddit is that it's open and we're privy to haphazardly implemented functionality and bad API documentation.

I work on big closed source web apps for a living; they're just as haphazard and badly documented, it's just all closed.

[–] [email protected] 28 points 1 year ago (2 children)

Why does Lemmy even ship its own image host? There are plenty of places to upload images you want to post that are already good at hosting images, arguably better than pictrs is for some applications. Running your own opens up whole categories of new problems like this that are inessential to running a federated link aggregator. People selfhost Lemmy and turn around and dump the images for "their" image host in S3 anyway.

We should all get out of the image hosting business unless we really want to be there.

[–] [email protected] 35 points 1 year ago (2 children)

Convenience for end-users and avoiding link rot are probably among the reasons.

[–] [email protected] 19 points 1 year ago* (last edited 1 year ago)

and avoiding link rot

Lemmy seems built to destroy information and rot links. Unlike Reddit, which has kept content around for 15 years, when a person deletes their account Lemmy removes all of their posts and comments, creating a black hole.

Not only do the deleted account's comments disappear; all the comments made by other users on those posts and comment threads disappear too.

Right now, a single user deleting one comment causes the entire branch of replies to disappear.

Spinning up an instance is quick... over 1000 new instances went online in June because of the Reddit API change. But once an instance goes offline, all the communities hosted there are orphaned, and no cleanup code really exists to salvage any of it, because the whole system was built around deleting comments and posts; in the designers' minds, deleting an instance is essentially a purge of everything it ever created.

[–] [email protected] 22 points 1 year ago (13 children)

I can’t be the only one getting bored with the 8-hr-old accounts spreading FUD.

If you have a legitimate concern, post it from your proper account. Otherwise it looks like you’re just trolling for Spez. It’s pathetic, really.

[–] [email protected] 6 points 1 year ago (1 children)

Additionally this isn't the community where this needs to be addressed. Either contact the admins or open an issue on GitHub.

[–] [email protected] 20 points 1 year ago (1 children)

In theory it's also possible to just be a nuisance by filling up the instance's available storage? That sounds like something that's going to get fixed one way or another.

[–] [email protected] 6 points 1 year ago

Yes - that's possible.

[–] [email protected] 20 points 1 year ago (4 children)

the admin will be liable for it.

...

These bad actors can then share these links around and you would never know! They can report it to the FBI and if you haven’t taken it down (because you did not know) for a certain period, say goodbye to your instance and see you in court.

In most jurisdictions this is not how it would work. Even a less tech-savvy investigator would figure out that it was an online community not obviously affiliated with CSAM, and focus on alerting you and getting the content removed.

There's this misunderstanding that CSAM is some sort of instant go-to-prison situation, but it really does depend on context. It's generally not so easy to just plant illegal files and tip off the FBI, because the FBI is strategic enough not to be weaponized like that. Keep an eye on your abuse and admin email inboxes, and take action as soon as you see something, and nobody is going to shut you down or drag you to court.

[–] [email protected] 16 points 1 year ago (8 children)

This is just like how someone could put printed CSAM behind a bush in my yard or something and some authorities could decide to hold me responsible.

[–] [email protected] 13 points 1 year ago

Yeah, this is a big issue. I know Lemmy blew up a bit before it was truly ready for prime time but I hope this cleans up.

[–] [email protected] 8 points 1 year ago (2 children)

In the USA, it's not really true that admins are liable

[–] [email protected] 8 points 1 year ago (1 children)

Are individuals granted the same Section 230 protections as organizations when it comes to self-hosting an instance? I doubt people are forming non-profits for their self-hosting endeavors.

[–] [email protected] 8 points 1 year ago

Most admins aren't in the USA. But that's not really the issue here is it?
