
So, I'm self-hosting Immich. The issue is that we tend to take a lot of pictures of the same scene/thing to later pick the best one, so we can end up with 5-10 photos which are basically duplicates, but not quite.
Some duplicate-finding programs rate those images at 95% or higher similarity.

I'm wondering if there's any way, probably at the file system level, for those near-identical images to be compressed together.
Maybe deduplication?
Have any of you handled a similar situation?

[–] [email protected] 4 points 2 months ago (2 children)

I'm not saying to delete them; I'm saying the file system could save space with something similar to deduping.
If I understand correctly, deduping works by pointing files at the same data blocks when their contents match, so there's no actual data loss.

[–] [email protected] 6 points 2 months ago* (last edited 2 months ago)

I believe this is what some compression algorithms can do if you compress the similar photos into a single archive. It sounds like that's what you want: archive each group (e.g. each day), have Immich cache the thumbnails, and only decompress the originals when you view them at full resolution. Maybe test some algorithms like zstd against a group of similar photos vs compressing them individually?
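A minimal sketch of that comparison in Python, assuming the third-party zstandard package is installed and using a placeholder folder of burst photos:

```python
import zstandard as zstd  # pip install zstandard
from pathlib import Path

# Hypothetical folder of near-duplicate photos; replace with your own files.
photos = sorted(Path("burst_shots").glob("*.jpg"))
data = [p.read_bytes() for p in photos]

cctx = zstd.ZstdCompressor(level=19)

# Compress each photo on its own and sum the results.
individual = sum(len(cctx.compress(d)) for d in data)

# Compress all photos concatenated into one stream (a crude stand-in for an archive).
grouped = len(cctx.compress(b"".join(data)))

original = sum(len(d) for d in data)
print(f"original:   {original} bytes")
print(f"individual: {individual} bytes")
print(f"grouped:    {grouped} bytes")
```

In practice the grouped result is often barely smaller than the individual sum, because the JPEGs' own entropy coding hides the pixel-level similarity from a general-purpose compressor.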

FYI, file system deduplication works on content hashes (usually per block). Only exact 1:1 binary duplicates produce the same hash, so near-duplicate photos won't be deduplicated.
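To illustrate, a small Python sketch of that idea: hash two files in fixed-size blocks, the way a block-level deduplicator would, and count how many blocks actually match (the file names and the 4 KiB block size are just placeholders):

```python
import hashlib

def block_hashes(path, block_size=4096):
    """Return the SHA-256 digest of each fixed-size block in the file."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            hashes.append(hashlib.sha256(block).digest())
    return hashes

# Two near-duplicate photos from the same burst (hypothetical paths).
a = block_hashes("shot_001.jpg")
b = block_hashes("shot_002.jpg")

shared = len(set(a) & set(b))
print(f"{shared} of {min(len(a), len(b))} blocks are identical")
# For two separately encoded JPEGs this is almost always 0,
# so block-level dedup has nothing to merge.
```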

Also, modern image and video formats are already about as heavily compressed as current consumer hardware allows, which is why recompressing a JPG or MP4 with a general-purpose compressor offers negligible savings and sometimes even increases the file size.
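A quick way to see that with just the standard library (the file name is a placeholder):

```python
import zlib
from pathlib import Path

raw = Path("photo.jpg").read_bytes()     # an already-compressed JPEG
recompressed = zlib.compress(raw, 9)     # maximum zlib compression level

print(f"original:     {len(raw)} bytes")
print(f"recompressed: {len(recompressed)} bytes")
# Typically within a few percent of the original, and occasionally larger.
```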

[–] [email protected] 1 points 2 months ago

I don't think there's anything commercially available that can do it.

However, as an experiment, you could:

  • Get a group of photos from a burst shot
  • Encode them as individual frames with a modern video codec, e.g. using VLC.
  • See what kind of file size you get with the resulting video output.
  • See what artifacts are introduced when you play with encoder settings.

You could probably script this kind of operation eventually, if you have software that can automatically identify and group similar images.
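A rough sketch of what such a script might look like in Python, assuming ffmpeg with libx265 is installed and using a placeholder folder of already-grouped burst photos:

```python
import subprocess
from pathlib import Path

burst_dir = Path("burst_shots")          # hypothetical folder of grouped photos
photos = sorted(burst_dir.glob("*.jpg"))
output = burst_dir / "burst.mkv"

# Encode the stills as 1 fps HEVC frames; adjust -crf to trade file size vs artifacts.
subprocess.run(
    [
        "ffmpeg", "-y",
        "-framerate", "1",
        "-pattern_type", "glob", "-i", str(burst_dir / "*.jpg"),
        "-c:v", "libx265", "-crf", "20",
        str(output),
    ],
    check=True,
)

original = sum(p.stat().st_size for p in photos)
print(f"originals: {original} bytes, video: {output.stat().st_size} bytes")
```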