
Hello nerds! I'm hosting a lot of things on my home lab using docker compose, with the config files in a private GitHub repo. This works fine, but every time I want to make a change I have to push the changes, ssh to the lab, pull the changes, and run docker compose up. I want to automate that. Does anyone have a similar setup and know of a good tool? I know I could use watchtower to update existing images, but this is more for when I change a setting or add a new service.

I've considered four approaches so far. They all boil down to the same pull-and-up deploy step, sketched after the list.

  1. A new container that mounts the whole running directory and the docker socket. It registers a webhook in GitHub to be notified when I push to the repo, then runs git pull and docker compose up. My worries here are the usual dind gotchas; rough wiring for this is sketched at the end of the post.

  2. Same as 1, but don't mount anything; instead, ssh from the container to the host and run the steps there. This sidesteps the dind issues, but I don't love giving the container an ssh key to the host.

  3. Have a service running on the host, outside of docker. This is probably the correct approach, but it's annoying since my host is a Synology NAS and it doesn't have systemd or anything like that afaik.

  4. Have a GitHub Action ssh to the machine and run the steps. Honestly the easiest way, but I'd prefer not to open ssh to the internet.
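
Whichever of these wins, the deploy step itself is the same everywhere. A minimal sketch, assuming the repo is checked out at /volume1/docker/homelab (the path and script name are made up):

```sh
#!/bin/sh
# deploy.sh - bring the running stack in sync with whatever is in the repo
set -eu
cd /volume1/docker/homelab
git pull --ff-only                     # fail loudly instead of auto-merging
docker compose up -d --remove-orphans  # apply changes, drop deleted services
```

Options 1, 2 and 4 then only differ in who runs this script, and from where.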

Any feedback or tips are much appreciated. None of my options feel very good, and I suspect I'm missing something obvious.
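
For option 1 specifically, the listener doesn't have to be hand-rolled: adnanh/webhook is a small tool built for exactly this GitHub-webhook-to-script plumbing. A rough sketch of the wiring, assuming the almir/webhook image and the deploy.sh above (names, port and paths are all illustrative, and the image would still need git and the docker CLI added, which is the dind awkwardness again):

```sh
# listener container: mounts the stack directory and the docker socket so
# deploy.sh can run git and docker compose against the host
docker run -d --name hooks \
  -p 9000:9000 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v /volume1/docker/homelab:/stack \
  -v /volume1/docker/hooks.json:/etc/webhook/hooks.json:ro \
  almir/webhook -verbose -hooks /etc/webhook/hooks.json
# hooks.json maps an id like "redeploy" to /stack/deploy.sh; the GitHub
# webhook then points at http://<nas>:9000/hooks/redeploy, ideally with the
# shared secret checked via a trigger rule on X-Hub-Signature-256
```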

[email protected] 6 points 5 months ago

Why not host your own git repo (e.g. Gitea)? Then you could do 2 or 4 without exposing anything to the internet.
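
Gitea itself is a single container, so the barrier to trying it is low. A minimal sketch (tag, ports and data path are illustrative):

```sh
# Gitea on the LAN: 3000 is the web UI, and host port 2222 maps to the
# container's sshd so it doesn't clash with the NAS's own ssh on 22
docker run -d --name gitea \
  -p 3000:3000 \
  -p 2222:22 \
  -v /volume1/docker/gitea:/data \
  gitea/gitea:latest
```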

[email protected] 1 point 5 months ago

I'd be a bit concerned with having the git repo also be hosted on the machine itself. If the drives break, it's all gone. I could of course have two remotes, but then pushing changes still becomes a multi-step procedure.

[email protected] 6 points 5 months ago

Back up, mate. Either locally or to something over the network. When it comes to data loss, it will find you eventually.

[email protected] 2 points 5 months ago

I do have nightly off-site backups, that's true. Still, having the git repo on the same machine doesn't seem right to me.

[email protected] 1 point 5 months ago

You can set up multiple remotes for a repo and push to a local git server and GitHub at the same time.
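
Plain git handles this without extra tooling. A sketch, with both URLs obviously made up:

```sh
# give "origin" two push URLs; the first --add replaces the implicit push
# URL, so both GitHub and the local server are listed explicitly
git remote set-url --add --push origin git@github.com:you/homelab.git
git remote set-url --add --push origin git@gitea.lan:you/homelab.git

git push   # a single push now updates both remotes
```

Fetches still come from origin's normal fetch URL; only pushes fan out.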

[email protected] 2 points 5 months ago

I would strongly suggest a second device, like an RPi running Gitea. That's what I have.

I use Portainer to pull straight from git and deploy.

[email protected] 1 point 5 months ago

> I'd be a bit concerned with having the git repo also be hosted on the machine itself.

Please tell me you have a tested backup solution/procedure in place.