Restic (local repo) which I sync onto a Hetzner Storagebox using rclone.
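That workflow can be sketched roughly like this (the repo path, password file, and the `storagebox:` rclone remote name are placeholders, not the poster's actual setup):

```shell
# Snapshot into a local restic repository, then mirror the repo with rclone.
# All paths and the "storagebox:" remote are hypothetical examples.
export RESTIC_REPOSITORY=/srv/backup/restic-repo
export RESTIC_PASSWORD_FILE="$HOME/.restic-password"

restic backup "$HOME/Documents" "$HOME/Pictures"        # encrypted, deduplicated snapshot
restic forget --keep-daily 7 --keep-weekly 4 --prune    # thin out old snapshots

# Mirror the entire repository to the Hetzner Storage Box
# (the rclone remote must be configured beforehand with `rclone config`)
rclone sync "$RESTIC_REPOSITORY" storagebox:restic-repo
```

Because a restic repository is just a directory of encrypted files, syncing it with rclone preserves the snapshots and their encryption as-is.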
rsync (laptop -> external HDD, workstation -> dedicated backup HDD)
Syncthing (laptop <-> desktop)
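The rsync leg of a setup like that is usually a one-liner (the mount point below is a hypothetical example):

```shell
# One-way mirror of a home directory to an external HDD.
# -a preserves permissions/times, -H hard links, -A ACLs, -X xattrs;
# --delete makes the destination an exact mirror of the source.
rsync -aHAX --delete --info=progress2 \
    --exclude='.cache/' \
    "$HOME/" /mnt/external-hdd/laptop-backup/
```

Note that `--delete` means an accidental deletion at the source propagates to the backup on the next run, so it pairs well with snapshots on the destination.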
I simply use FreeFileSync.
@dustyData I have hundreds of thousands of files that need to be backed up locally and in the cloud. I use either Vorta or Pika. Both are interfaces for Borg. Easy to use, and their deduplication feature manages to save a lot of disk space. I tried so many backup solutions and none worked as reliably.
I've used a combination of
- Managing ZFS snapshots with pyznap
- Plain old rsync to copy important files that happen not to be on ZFS filesystems to ZFS.
If I were doing this over today, I'd probably consider https://zrepl.github.io/ instead of pyznap, as pyznap is no longer receiving real active development.
In the past I've used rdiff-backup, which is great but it's hard to beat copy-on-write snapshots for speed and being lightweight.
rsnapshot
I use boring old zfs snapshot + zfs send -i.
It's not pretty, but it's reliable.
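A minimal sketch of that snapshot-and-send loop (pool, dataset, snapshot names, and host are hypothetical):

```shell
# Hypothetical pool/dataset/host names. Take a new snapshot, then send
# only the changes since the previous snapshot to a receiving pool.
zfs snapshot tank/data@2024-06-02
zfs send -i tank/data@2024-06-01 tank/data@2024-06-02 | \
    ssh backuphost zfs receive backup/data
```

An incremental send only transfers blocks that changed between the two snapshots, which is what keeps this approach fast and lightweight.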
Deja Dup backs up my local machines to my Synology NAS. That uses Hyper Backup to send everything to Dropbox.
I use Pika and Timeshift.
Deja Dup
Well it was Duplicati, until it pulled this bullshit on me. I had a critical local failure of my data a month ago, 2.8TB lost. Pulled the backup off AWS S3 with my Linux server, asked Duplicati to restore it, and it's failed 4 times for random reasons, taking a week to get there each time. Once I can get this backup to finally restore, I'm moving over to Duplicity.
Stuff like that is why I ditched duplicati. I had to rebuild the local db that would randomly corrupt itself one too many times.
Exactly where my failure is. It's corrupting mid-way through the rebuild for no apparent reason.
Duplicity over SSH to my backup NAS, which then backs up to the iDrive cloud service weekly.
My phone and tablet are both Samsung devices, which back up to OneDrive.
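For reference, a duplicity-over-SSH job might look like this (host, paths, and GPG key ID are hypothetical placeholders):

```shell
# Encrypted incremental backup over SFTP to a NAS. Duplicity does a full
# backup the first time, then incrementals; --full-if-older-than forces a
# fresh full chain monthly so restores don't depend on a long delta chain.
duplicity --encrypt-key ABCD1234 \
    --full-if-older-than 1M \
    "$HOME/Documents" \
    sftp://backup@nas.local//volume1/backups/laptop
```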
I just map my entire documents, pictures and other important home folders to subfolders inside Dropbox. This propagates all of my files across all of my computers via the cloud and makes everything accessible from my phone as well.
I don't worry about backing up my operating system, though important configuration file locations are also mapped into Dropbox for easily setting things up again. Complete portable apps are also located in Dropbox.
If only restic deduplicated... But other than that it does okay.
restic does do deduplication.
TrueNAS on an inexpensive server with RAID. I have several computers in different rooms of the house I like to make music on, and on these PCs my network drives all have the same drive letters for the sample libraries, recordings, projects, and backups. So my projects can run from any computer without missing files. I always save locally and on the TrueNAS.
Timeshift with rsync, and on occasion I Clonezilla the drive and save it to my NAS.
Clonezilla, Redo Rescue.
I’ve recently started using proxmox-backup-client. Works well. Goes to my backup server along with my VM image backups. Works nicely with full deduplication and such. Quite good savings if you are backing up multiple machines.
I then rsync this up to the cloud once a day.
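A typical invocation looks roughly like this (the repository string and datastore name are hypothetical):

```shell
# Client-side backup of the root filesystem to a Proxmox Backup Server.
# Data is chunked and deduplicated server-side across all clients, which
# is where the savings come from when backing up multiple machines.
export PBS_REPOSITORY='backup@pbs@pbs.example.org:datastore1'
proxmox-backup-client backup root.pxar:/
```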
I have no relevant data locally. My Documents is a symlink to a Nextcloud directory running on my Synology NAS on a RAID1 that backs up to cloud storage via one of their tools (forgot which one).
I never liked having to backup working machines. If it breaks I'm fine with having to install again. I won't lose data though.
At this moment I use too many tools.
For user data on my PC and home server I mostly use Duplicacy. It is fast and efficient. All data is backed up locally to a NAS box over SFTP, and a subset of that data is backed up to S3 cloud storage.
I have a Mac; this one uses Time Machine, storing data on the NAS, which is then synced to S3 cloud storage once a day.
And on top of that, VMs and containers from the home server are backed up by Proxmox's built-in tool to the NAS. These mostly exclude user data.
An external hard drive works 100%. And relying on .dotfiles to redownload the whole thing back.
...I mean, it takes less than 3 minutes to redownload everything and 5 to reconfigure it manually, so eh.
Restic in the homelab and Veeam at work. I’m pretty happy with both!
I s3 sync everything to a versioned S3 bucket out on the internets.
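With the AWS CLI, that pattern is roughly (the bucket name is a hypothetical placeholder):

```shell
# Enable versioning once, so overwritten and deleted objects keep their
# old versions instead of disappearing.
aws s3api put-bucket-versioning --bucket my-backup-bucket \
    --versioning-configuration Status=Enabled

# Then sync; --delete mirrors deletions, but versioning retains the
# previous object versions as a safety net.
aws s3 sync "$HOME/data" s3://my-backup-bucket/data --delete
```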
What kind of cost is that?
I just use MegaSync, which backs up my config folder and documents folder.
On my phone, I use Syncthing to back up to my home server (I never knew Syncthing could back up over WAN), which is then synced to MegaSync. I also keep all the MegaSync files on my server, just in case MegaSync suddenly goes down one day.