this post was submitted on 19 Jun 2023

Selfhosted

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


Hi all

I'm running several Docker containers with local persistent volumes that I would like to back up, but I haven't found an easy method to do so.

What do you use / recommend to do that? AFAIK you can't just rsync the volume directory while the container is running.

top 5 comments
[–] [email protected] 6 points 1 year ago (1 children)

Use bind mounts instead of docker volumes. Then you just have normal directories to back up, the same as you would anything else.

In general, it's not a problem to back up files while the container is running. The exception to this is databases. To have reliable database backups, you need to stop the container (or quiesce/pause the database if it supports it) before backing up the raw database files (including SQLite).
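A minimal sketch of that stop/copy/start cycle, assuming a hypothetical container named `myapp` whose data lives in a bind-mounted host directory (all names and paths below are placeholders):

```shell
#!/bin/sh
# Hypothetical names: adjust the container name and paths to your setup.
set -eu

APP=myapp                 # container to quiesce
DATA=/srv/myapp/data      # bind-mounted host directory
DEST=/backup/myapp        # backup target

docker stop "$APP"                   # stop so database files are consistent
rsync -a --delete "$DATA/" "$DEST/"  # copy the data directory
docker start "$APP"                  # bring the service back up
```

The downtime is only as long as the rsync takes; subsequent runs copy only changed files.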

[–] [email protected] 2 points 1 year ago

This is your answer. It also has the benefit of allowing you to have a nice folder structure for your Docker setup, where you have a folder for each service holding the corresponding compose YAML and data folder(s).
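For example, a per-service layout might look like this (service names are just placeholders):

```
docker/
├── nextcloud/
│   ├── docker-compose.yml
│   └── data/
└── gitea/
    ├── docker-compose.yml
    └── data/
```

Backing up everything is then just a matter of copying the `docker/` tree.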

[–] [email protected] 4 points 1 year ago

Rsync works fine for most data (I use borgbackup myself). For any database, create a dump using pg_dump, mysqldump, or whatever fits your database, then back up the dump and all other volumes but exclude the live db volume.
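As a sketch of that dump-then-exclude approach, assuming a hypothetical Postgres container named `db` and a borg repository (container name, database name, and paths are placeholders):

```shell
#!/bin/sh
set -eu

# Dump the database from inside the running container to a file on the host.
docker exec db pg_dump -U postgres mydb > /srv/backups/mydb.sql

# Back up everything, excluding the live database volume directory,
# since the dump already captures it in a consistent state.
borg create --exclude '/srv/docker/db/data' \
    /srv/backups/repo::'{hostname}-{now}' /srv/docker /srv/backups/mydb.sql
```

The dump is consistent even while the database is serving traffic, which is why the raw data directory can be safely excluded.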

[–] [email protected] 1 points 1 year ago

You can copy data from Docker volumes to somewhere on the host node and run the backup from there. You can also attach a container to the volumes and send the data directly to a remote from inside it, or map a directory on the host node to copy the files to.

If you are running a database or something stateful, look at best practices for backup and then adjust to it.

Or skip volumes entirely and map directories on the host node directly. Each approach works and has its own advantages/disadvantages.

[–] [email protected] 1 points 1 year ago

Bind mounts are easy to maintain and back up. However, if you share data among multiple containers, Docker volumes are recommended, especially for managing state.

Backup volumes:

docker run --rm --volumes-from dbstore -v "$(pwd)":/backup ubuntu tar cvf /backup/backup.tar /dbdata

  • Launch a new container and mount the volume from the dbstore container
  • Mount a local host directory as /backup
  • Pass a command that tars the contents of the dbdata volume to a backup.tar file inside the /backup directory.

docker docs - volumes

To back up a database volume without stopping the service: exec into the container, create a dump (e.g. with pg_dump or mysqldump), and copy it out with docker cp. Run it periodically via crontab.
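A sketch of that approach as a cron job, assuming a hypothetical MySQL container named `db` (container name, credentials, and paths are placeholders; note that `%` must be escaped in crontab entries):

```shell
# /etc/cron.d/db-backup (illustrative): dump nightly at 03:00 without
# stopping the container; the dump file can then be backed up like any file.
0 3 * * * root docker exec db mysqldump -u root -psecret mydb > /srv/backups/mydb-$(date +\%F).sql
```

The redirection runs on the host, so the dump lands outside the container and is picked up by whatever file-level backup tool you already use.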