this post was submitted on 14 Jul 2023
125 points (97.7% liked)


I don't mean system files, but your personal and work files. I have been using Mint for a few years; I use Timeshift for system backups, but archive my personal files by hand. This got me curious to see what other people use. When you daily-drive Linux, what are your preferred tools for keeping backups? I have thousands of pictures, family movies, documents, personal PDFs, etc. that I don't want to lose. Some are backed up to the cloud, but rather haphazardly. I would like a more systematic approach and a tool that is user-friendly and easy to set up and schedule.

top 50 comments
[–] [email protected] 13 points 1 year ago (4 children)

Timeshift is nice to make things easy. I simply use good old-fashioned rsync tied to a cron job.
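
For anyone who hasn't wired that up before, a minimal sketch of the idea (paths and schedule are just placeholders):

    # crontab -e: run a nightly mirror of the home directory to an external drive at 02:00.
    # -a preserves permissions and timestamps; --delete removes files from the copy
    # that were deleted from the source, so the destination stays an exact mirror.
    0 2 * * * rsync -a --delete /home/user/ /mnt/backup/home/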

load more comments (4 replies)
[–] [email protected] 10 points 1 year ago

For personal files I use Borg (with Vorta) and/or Restic

[–] [email protected] 9 points 1 year ago (6 children)

Syncthing. I don't want to invest in a NAS and add to my already greedy power bill, so I chose something decentralized. Syncthing really just works like BitTorrent, but for your personal files: whatever happens on the computer also happens on the phone and on the laptop. Each has about 1 TB of space and 3× redundancy? Hell yea buddy, dig in.

load more comments (6 replies)
[–] [email protected] 8 points 1 year ago

Restic. Borg is also great.

[–] [email protected] 8 points 1 year ago

I have been using Borg for years. So far, the tool has not let me down. I store the backups on external hard drives that are only used for backups. In addition, I save really important data at rsync.net and in a Hetzner storage box. That is not a problem, because Borg encrypts locally, and in my case decryption requires both a password and a key file.
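
For reference, the basic Borg workflow looks roughly like this (repo path and folders are placeholders; keyfile mode is what makes restores require both the passphrase and a key file):

    # One-time: create an encrypted repository; "keyfile" keeps the key outside the repo.
    borg init --encryption=keyfile /mnt/backup/borg-repo

    # Per run: create a dated archive of the important folders.
    borg create --stats --compression zstd \
        /mnt/backup/borg-repo::'{hostname}-{now}' \
        ~/Documents ~/Pictures

    # Thin out old archives so the repo doesn't grow forever.
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/backup/borg-repo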

Generally speaking, you should always test whether you can restore data from a backup, no matter which tool you use. Only then do you have a real backup. And an up-to-date backup should additionally be stored off-site (cloud, at a friend's or relative's house, etc.), because if the house burns down, the external hard drive sitting next to the computer is not much use.

By the way, I would advise against using just rsync because, as the name suggests, rsync only synchronizes, so you don't keep multiple versions of a file. Those versions can be useful if you only notice later that a file became corrupted at some point.

[–] [email protected] 8 points 1 year ago

Borg Backup (specifically using Vorta front end)

[–] [email protected] 7 points 1 year ago (1 children)

+1 for rsync to an external hard drive. Super fast. Also useful in case I need a backup of a single file that I changed or deleted by mistake. Work files are also backed up to the cloud on mega.nz, which is very useful for cross-computer sync as well. But I don't trust personal files to the cloud.

[–] [email protected] 8 points 1 year ago* (last edited 1 year ago) (1 children)

Don't forget that a local backup is as bad as no backup at all in the case of a fire or other disaster. Not trusting the cloud is fine (though strong encryption can make it very safe), but looking into some kind of off-site backup is important. It could be as simple as a second hard drive that you swap out weekly and store in a safe deposit box, or a NAS at a trusted friend's house.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

Completely agree! I didn't mention this, but I keep the back-up hard drive in another apartment.

This reminds me of a story from a university in England: they had two backups of some server in two different locations. One day one backup drive failed, and the second failed the day after. Apparently they were the same brand and model. The moral: also use different backup hardware brands or media!

load more comments (1 replies)
[–] [email protected] 7 points 1 year ago (3 children)

Kopia repo on a separate disk dedicated to backups. I have Kopia on my servers as well, sending to my local S3 gateway, with a second copy going to Wasabi.
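
For anyone curious, the CLI side of Kopia is pretty small (paths are placeholders):

    # Create a repository on the dedicated backup disk (prompts for a password).
    kopia repository create filesystem --path /mnt/backup/kopia

    # Snapshot a folder; later runs are incremental and deduplicated.
    kopia snapshot create ~/Documents

    # See what's in the repo.
    kopia snapshot list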

load more comments (3 replies)
[–] [email protected] 7 points 1 year ago (1 children)
[–] [email protected] 2 points 1 year ago

I use rsync personally, but for low-tech family members, and especially for cross-platform backups to network locations, Carbon Copy Cloner has a nice interface and runs a series of rsyncs under the hood.

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (1 children)

Restic and Borg are the best I've tried for remote, encrypted backups.

I personally use Restic for my remote backups and rsync for my local.

Restic beats out Borg for me because there are a lot more compatible storage options.
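
To illustrate that point: the workflow is the same whether the repository is a local disk or an S3-compatible bucket; only the repository string changes (bucket name, endpoint, and paths below are made up):

    # Local repository on an external drive.
    restic -r /mnt/backup/restic init
    restic -r /mnt/backup/restic backup ~/Documents ~/Pictures

    # Same commands against an S3-compatible bucket (B2, Wasabi, MinIO, ...).
    export AWS_ACCESS_KEY_ID=...
    export AWS_SECRET_ACCESS_KEY=...
    restic -r s3:s3.us-west-000.backblazeb2.com/my-backup-bucket backup ~/Documents

    # Thin out old snapshots.
    restic -r /mnt/backup/restic forget --keep-daily 7 --keep-monthly 6 --prune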

[–] [email protected] 4 points 1 year ago

Switched to Restic because then I don’t need any extra software on the server (Synology NAS in my case).

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (1 children)

I used to be mostly Restic, but I've since moved over to Kopia - having the central server on the NAS and shipping those files to B2 is easy enough for my level of laziness.

load more comments (1 replies)
[–] [email protected] 5 points 1 year ago

Only syncthing, for me.

[–] [email protected] 5 points 1 year ago (3 children)

I almost never see rdiff-backup in these threads, so I am bringing it up now. I really like how it works: it provides incremental backups while keeping the folder structure and files directly accessible. Works well enough for me.

[–] [email protected] 3 points 1 year ago (1 children)

I love rdiff-backup.

I use it to back up a 30 TB array and it completes in like 20 minutes if there are no changes.

load more comments (1 replies)
load more comments (2 replies)
[–] [email protected] 5 points 1 year ago (1 children)

My local backups are handled by rdiff-backup to a mirror set of disks. That means my data is versioned but easily accessible for immediate restore, and now on three disks (my SSD, and two rotating rust drives). It also makes restores as simple as copying a file if I want the latest version, or an easy command if I want an older version. And testing backups is as easy as a diff command to compare the backup version with the live version.
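
For anyone who hasn't used it, that whole cycle is just a few commands (paths are placeholders):

    # Back up: the destination stays a normal directory tree with the latest versions
    # as plain files, plus an rdiff-backup-data/ directory holding the increments.
    rdiff-backup /home/user /mnt/mirror/backups/home

    # Restore a single file as it was 3 days ago.
    rdiff-backup --restore-as-of 3D /mnt/mirror/backups/home/notes.txt ~/notes.txt

    # Test: compare the latest backup with the live files.
    diff -r -x rdiff-backup-data /home/user /mnt/mirror/backups/home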

Having your files just be files in your backup solution is very handy. At work I don't mind having to use an application like Veeam, because I'm being paid to do that. At home I want to see my backups quickly and easily, because I'd rather be working on my files than wrestling with backup software...

Remote backups are handled by SpiderOak, who have been fine for me for almost a decade. I also use them to synchronise my desktop and laptop computer. On my desktop SpiderOak also backs up some files in an archive area on the rotating rust mirror set - stuff that's large and I don't access often, so don't need to put on my laptop but do want backed up.

I also have a USB thumbdrive that's encrypted and used when I'm travelling to back up changes on my laptop via a simple rsync copy - just in case I have limited internet access and SpiderOak can't do its thing...

I did also have a NAS in the mix once, but I realised that it was a waste of energy - both mine and electricity. In normal circumstances my data is in five places (desktop SSD, laptop SSD, the two disks of the desktop mirror set, and SpiderOak's storage), and in the very worst case it's in two places (laptop SSD, USB thumbdrive). Rdiff-backup to the NAS was simply overkill once I'd added the local mirror set to my desktop, so I retired it.

I'd added the local mirror set because I was working with large files - data sets and VM images - and backups over the network to the NAS were taking an age. A local set of cheap disks in my desktop tower was faster and yet still fairly cheap.

Here's my advice for your consideration:

  • Simple is better than complicated.
  • How you restore is more important than how you back up; perform test restores regularly.
  • Performance matters; backups that take ages are backups you won't run.
  • Look to meet the 3-2-1 criteria; 3 copies, on 2 different storage systems, with at least 1 in a different geographic location. Cloud storage helps with this.

Good luck with your backup strategy!

load more comments (1 replies)
[–] [email protected] 5 points 1 year ago

At work/for business, you can't beat Veeam. It's the gold standard and there is literally nothing better.

At home, Duplicity. Set it up once and then just let it go, and it supports a million different backup targets you can ship your backups off to, including the local filesystem. Has auto-aging/removal rules, easy restores, incrementals, etc. Encrypts by default too.
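
As a sketch, a minimal Duplicity setup against a local target looks like this (paths and retention are placeholders; swap file:// for sftp://, s3://, b2://, etc. for remote targets):

    # Encrypted, incremental backup of the home directory.
    duplicity /home/user file:///mnt/backup/duplicity

    # Restore the latest version to a scratch directory.
    duplicity restore file:///mnt/backup/duplicity /tmp/restore

    # Apply an aging rule: drop chains older than six months.
    duplicity remove-older-than 6M --force file:///mnt/backup/duplicity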

[–] [email protected] 5 points 1 year ago (1 children)
load more comments (1 replies)
[–] [email protected] 4 points 1 year ago

I like Pika Backup. It's a frontend for borgbackup that also lets you mount and browse your archives with a few clicks. I think it's pretty handy on a desktop PC. And since it uses borgbackup, you also get encryption with it.

[–] [email protected] 4 points 1 year ago (5 children)

Git for projects, NAS for 3D printing stuff, mods for games and unofficial game translations, Google Photos for photos (looking to migrate away from that when I have time). I don't much care about anything else.

load more comments (5 replies)
[–] [email protected] 4 points 1 year ago

External hard drive, drag & drop.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

KDE user, so for my personal files I back up with Kup and bup (install both); you get the choice of a cloning-type backup or only changed files, with the ability to go back in time. It integrates into the KDE taskbar/system settings.

For redundancy, I back up my main sync folder on the desktop to my laptop using Syncthing over my WiFi/network.

[–] [email protected] 4 points 1 year ago

I almost never see FreeFileSync mentioned in these threads. It's the only GUI-based app I know of that also gives you the option, for example, not to propagate file deletions. It can also be automated with crontab. Backups are not fragmented or repackaged, so you can browse them just fine. Encryption can be done with VeraCrypt.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (5 children)

I do 2 backups

Veeam system image daily; this is a fully bootable image of every drive on my system, kept for things like hardware failure or "oops" moments. It just goes to my NAS for fast local storage.

Online backup of important files daily; this has changed a few times - I was using Restic to B2, then Duplicati to Wasabi S3, and now I'm trying iDrive to see how that is.

My favorite tools are definitely Veeam and Duplicati, because they both have a good UI and are easy to use, both automatically run in the background and handle scheduling entirely on their own. Browsing snapshots is easy and finding the files you want at a specific date/time is quick.

I've used Restic and Kopia as well; they're much harder to use, especially for restores - finding files is a nightmare via the CLI. Scheduling is a pretty involved step, and you have to figure out how to run them in the background yourself. Both also performed really slowly for me on my ~3 TB backup set of about 50k files, compared to Veeam and Duplicati, which are very fast.

[–] [email protected] 2 points 1 year ago

+1 for Veeam. I am a backup administrator and this is our tool of choice. I use it for my home machines as well and it works great.

Just remember, you don’t have a backup unless you have tested it.

load more comments (4 replies)
[–] [email protected] 3 points 1 year ago
[–] [email protected] 3 points 1 year ago

GNOME Disk Utility for backing up the whole hard drive. Otherwise, I use BackInTime.

[–] [email protected] 3 points 1 year ago

I use Dirvish, a text-based, cron-enabled rsync front end. Read dirvish.org for details about it.

I use this to clone and hold time based backups to external disks which I can verify or use offsite.

Rock solid for years.

[–] [email protected] 3 points 1 year ago

Borg backup

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

Borg Backup (via the Pika Backup frontend, a Libadwaita GNOME app) to one of my physical drives and also to borgbase.com (free tier: 10 GB).

[–] [email protected] 3 points 1 year ago

I use Timeshift for local backups, then Duplicati for backing up to Amazon Glacier monthly.

[–] [email protected] 3 points 1 year ago

Grsync, it's easy to use.

[–] [email protected] 3 points 1 year ago

Simply rsync in a crontab.

[–] [email protected] 3 points 1 year ago
[–] [email protected] 2 points 1 year ago

I've used a combination of

  • Managing ZFS snapshots with pyznap
  • Plain old rsync to copy important files that happen not to be on ZFS filesystems to ZFS.

If I were doing this over today, I'd probably consider https://zrepl.github.io/ instead of pyznap, as pyznap is no longer receiving real active development.

In the past I've used rdiff-backup, which is great but it's hard to beat copy-on-write snapshots for speed and being lightweight.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

I use boring old zfs snapshot + zfs send -i.
It's not pretty, but it's reliable.
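
For the curious, the whole thing is roughly (pool and dataset names are placeholders):

    # Take today's snapshot.
    zfs snapshot tank/home@today

    # Send only the changes since the previous snapshot to another pool or host.
    zfs send -i tank/home@yesterday tank/home@today | \
        ssh backuphost zfs receive -F backup/home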

[–] [email protected] 2 points 1 year ago

Déjà Dup backs up my local machines to my Synology NAS, which uses Hyper Backup to send everything to Dropbox.

[–] [email protected] 2 points 1 year ago
[–] [email protected] 2 points 1 year ago (2 children)

Well, it was Duplicati, until it pulled this bullshit on me. I had a critical local failure of my data a month ago - 2.8 TB lost. I pulled the backup off AWS S3 with my Linux server and asked Duplicati to restore it, and it's failed 4 times for random reasons, taking a week to get there each time. Once I can get this backup to finally restore, I'm moving over to Duplicity.

load more comments (2 replies)
[–] [email protected] 2 points 1 year ago

Duplicity over SSH to my backup NAS, which then backs up to a cloud service (iDrive) weekly.

My phone and tablet are both Samsung, which use OneDrive for backups.

[–] [email protected] 2 points 1 year ago

I have no relevant data locally. My Documents is a symlink to a Nextcloud directory running on my Synology NAS on a RAID 1 array that backs up to cloud storage via one of their tools (I forget which one).

I never liked having to backup working machines. If it breaks I'm fine with having to install again. I won't lose data though.

[–] [email protected] 2 points 1 year ago

I use Back In Time. It's served me well for quite a few years.
