this post was submitted on 29 May 2025
24 points (96.2% liked)

Linux

I want to have a mirror of my local music collection on my server, and a script that periodically updates the server to, well, mirror my local collection.

But crucially, I want to convert all lossless files to lossy, preferably before uploading them.

That's the one reason why I can't just use git - or so I believe.

I also want locally deleted files to be deleted on the server.

Sometimes I even move files around (I believe in directory structure), and again, git would deal with this perfectly if it weren't for the lossless-to-lossy caveat.

It would be perfect if my script could recognize moves the way git does, instead of deleting and re-uploading the same file to a different location.

My head is spinning round and round, and before I continue messing around with find and scp, it's time to ask the community.

I am writing in bash, but if some Python module could help with this, I'm sure I could find my way around it.

TIA


additional info:

  • Not all files in the local collection are lossless. A variety of formats.
  • The purpose of the remote is for listening/streaming with various applications
  • The lossy version is for both reducing upload and download (streaming) bandwidth. On mobile broadband FLAC tends to buffer a lot.
  • The home of the collection (and its origin) is my local machine.
  • The local machine cannot act as a server
top 6 comments
[–] [email protected] 19 points 2 days ago* (last edited 2 days ago) (1 children)

Speaking as a data engineer, you’re having trouble because git is the wrong tool for the job. You can make it work if you use git-lfs + custom hooks — but if you choose to go that route, be aware you’re making things unnecessarily hard for yourself.

If you want to make this easy, separate out your concerns:

  1. Versioning: take periodic snapshots of your unconverted files with a binary-friendly diffing tool like restic or borg. Alternatively, ZFS/btrfs snapshots are an excellent way to handle this.
  2. Conversion: keep your original files in their own directory. Set up a small script that searches your directory of original files recursively, passes the files to lame to encode to V0 or V2, and outputs them to a separate directory of lossy mp3 files.
  3. Syncing: use rsync with the --delete flag to copy your lossy files to the server + clear out files you've removed locally. (A rough sketch of steps 2 and 3 follows below.)
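
A minimal sketch of steps 2 and 3, assuming the originals live in ~/music/masters and the lossy mirror in ~/music/lossy (all paths, the remote target, and the V0 setting are placeholders; lame only reads WAV/PCM, so FLACs are decoded with flac first, and tag/metadata handling is left out entirely):

```bash
#!/usr/bin/env bash
# Sketch only: build a lossy mirror of the originals, then push it to the server.
set -euo pipefail

SRC="$HOME/music/masters"        # originals (mix of lossless and lossy)
DST="$HOME/music/lossy"          # local lossy mirror that gets uploaded
REMOTE="user@server:/srv/music"  # placeholder remote

# Copy over everything that is already lossy, preserving the directory tree.
# (Stale files in $DST left behind by deleted/renamed originals are not cleaned up here.)
rsync -a --exclude='*.flac' "$SRC/" "$DST/"

# Convert FLAC -> MP3 V0, skipping targets that are newer than their source.
find "$SRC" -type f -name '*.flac' -print0 | while IFS= read -r -d '' f; do
    rel="${f#"$SRC/"}"
    out="$DST/${rel%.flac}.mp3"
    mkdir -p "$(dirname "$out")"
    if [ ! -e "$out" ] || [ "$f" -nt "$out" ]; then
        flac -dcs "$f" | lame --quiet -V0 - "$out"
    fi
done

# Mirror the lossy tree to the server; --delete removes files you removed locally.
rsync -a --delete "$DST/" "$REMOTE/"
```

Step 1 would be a separate job, e.g. a restic or borg backup run against the originals, on whatever schedule suits you.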
[–] [email protected] 3 points 1 day ago* (last edited 1 day ago)

Thank you, this is very comprehensive! Yes, after some thought git is no good here.

I am coming round to a similar concept now: one job for lossy files - relatively easy with rsync I guess - and another for the lossless.

[–] FizzyOrange 5 points 2 days ago

Yeah, just encode the files locally and rsync them to the server. You could even use a Makefile to do the conversion.
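
For example, a rough Makefile sketch (directory names and the remote target are made up, recipe lines need real tabs, and Make is clumsy with spaces in filenames, which music libraries tend to have):

```make
SRC   := originals
DST   := lossy
FLACS := $(shell find $(SRC) -type f -name '*.flac')
MP3S  := $(patsubst $(SRC)/%.flac,$(DST)/%.mp3,$(FLACS))

all: $(MP3S)

# Pattern rule: re-encode only when the FLAC is newer than its MP3.
$(DST)/%.mp3: $(SRC)/%.flac
	mkdir -p $(dir $@)
	flac -dcs $< | lame --quiet -V0 - $@

sync: all
	rsync -a --delete $(DST)/ user@server:/srv/music/

.PHONY: all sync
```

Make's timestamp checking means only files whose source changed get re-encoded on each run.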

[–] mcmodknower 4 points 2 days ago (1 children)

For the conversion to lossy, git smudge/clean filters might work. Afaik git-lfs is built on top of them. In general, git-lfs might be something interesting for your use case.
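
If anyone is curious, a clean filter receives the file's content on stdin and writes the version to be stored on stdout, so a hypothetical, untested setup might look roughly like this; note the stored blob would be MP3 data but the path keeps its .flac name, which is one of the rough edges of this route:

```bash
# Sketch only: store FLACs as MP3 blobs via a clean filter.
# The filter name "lossy" and the ffmpeg invocation are assumptions, not a tested recipe.
git config filter.lossy.clean 'ffmpeg -loglevel error -i pipe:0 -codec:a libmp3lame -q:a 0 -f mp3 pipe:1'
git config filter.lossy.smudge cat     # identity on checkout in this sketch
echo '*.flac filter=lossy' >> .gitattributes
```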

[–] [email protected] 4 points 2 days ago* (last edited 2 days ago)

Wow, TIL. I'd never heard of git smudge/clean filters or git-lfs (Large File Storage). I'm not OP, just amazed at yet another corner of git I hadn't known about.

[–] [email protected] 2 points 2 days ago

You might be able to achieve what you want with pre-commit git hooks, i.e. a pre-commit script that converts your lossless files to lossy before committing them to the git repo.

https://pre-commit.com/

A quick googling gave me this demo Python pre-commit hook guide, but I'm sure you can write it in other languages if you'd like; you could probably even use a shell script.

https://dev.to/jalvaradosegura/create-your-own-pre-commit-hook-3kh
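
As a rough illustration of the shell-script route (entirely hypothetical; a real hook would need proper error handling and would lose tags/metadata in this form), a plain .git/hooks/pre-commit could look something like:

```bash
#!/usr/bin/env bash
# Sketch of a pre-commit hook: swap any staged FLAC for an MP3 before the commit lands.
set -euo pipefail

git diff --cached --name-only --diff-filter=AM -z -- '*.flac' |
while IFS= read -r -d '' f; do
    mp3="${f%.flac}.mp3"
    flac -dcs "$f" | lame --quiet -V0 - "$mp3"
    git add "$mp3"
    git rm --cached --quiet "$f"   # keep the FLAC on disk, but drop it from the commit
done
```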