this post was submitted on 02 Nov 2023
23 points (92.6% liked)

Sysadmin

A community dedicated to the profession of IT Systems Administration
 

I'm confused about protecting backups from ransomware. Online, people say that backups are the most critical aspect to recovering from a ransomware attack.

But how do you protect the backups themselves from becoming encrypted too? Is it simply a matter of having totally unique and secure credentials for the backup medium?

Like, if I had a Synology NAS as a backup target for my production environment's shared storage, VM backups, etc., hooked up to the network via gigabit, what stops the ransomware from encrypting that Synology too?

Thanks in advance for the feedback!

all 12 comments
[–] [email protected] 12 points 1 year ago (1 children)

If your backups are visible from the targeted systems, you are doing it wrong. Done right, the backup software at most runs an agent on the systems so it can contact them and pull the data; the backups themselves are never reachable from those systems.

Have a look at how BackupPC works. It doesn't even need an agent; it accesses network shares to pull the data:

https://backuppc.github.io/backuppc/
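The key idea in a pull-based tool like BackupPC is the direction of trust: the backup server initiates every connection and holds read-only credentials for the clients, while the clients hold no credentials at all for the backup store. A minimal conceptual sketch of that trust direction (all class and method names here are hypothetical, not BackupPC's actual API):

```python
# Conceptual sketch of the "pull" trust direction: the backup server
# reaches into the clients; the clients never reach the store. So
# ransomware running on a client has no credentials or path it could
# use to touch the backups.

class Client:
    """A production machine. It can only serve reads and holds no
    reference to the backup store."""
    def __init__(self, name, files):
        self.name = name
        self.files = dict(files)

    def read_all(self):
        # Read-only export of the data (think: a network share).
        return dict(self.files)

class BackupServer:
    """Initiates connections to clients and writes into a store the
    clients cannot see."""
    def __init__(self, store):
        self.store = store  # only the server holds this reference

    def pull(self, client):
        self.store[client.name] = client.read_all()

store = {}
server = BackupServer(store)
web = Client("web01", {"/etc/nginx.conf": "server {}"})
server.pull(web)

# Even if web01 is later compromised and its files are encrypted...
web.files["/etc/nginx.conf"] = "ENCRYPTED"
# ...the copy already pulled into the store is untouched:
print(store["web01"]["/etc/nginx.conf"])  # prints: server {}
```

The same reasoning applies to any agent-based product: the agent answers the server's requests, it never mounts or writes to the backup storage itself.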

[–] [email protected] 1 points 1 year ago (1 children)

I'll check out BackupPC. What is the most common / best-practice way to make sure the backup medium isn't accessible from any endpoints on the network?

[–] [email protected] 3 points 1 year ago

Unplug it after the backup.

[–] [email protected] 10 points 1 year ago

Immutable/offline backups. If you back up to local physical media (HDD/tape), physically disconnect/eject it and store it somewhere safe. If you back up to cloud storage (S3, etc.), many providers offer immutability options. Configured properly, nobody (not even you) can delete or modify the backups within the specified retention period.

[–] [email protected] 7 points 1 year ago (1 children)

Backups serve different purposes, and if encryption by malware is a threat, you have to do backups differently than for, say, hardware failure, where your NAS is a valid approach. To protect against encryption malware, you must make your backups inaccessible. One example is read-only backup media like DVD-ROMs. Another is to make regular backups to tapes or HDDs and lock them up somewhere, taking them out only after you have wiped all computers affected by the malware.

[–] [email protected] 2 points 1 year ago

What about simulated air gaps? That is, a backup system that turns off its own networking once it's done with the current backup and only turns networking back on when it's ready to start backing up again?
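The idea above can be sketched as a simple state machine: the network is only up for the duration of a backup run. A hypothetical illustration (not a real product's API), with the obvious caveat that anything active while the window is open, such as dormant malware striking mid-backup, can still reach the box, so this is weaker than a true air gap or immutable storage:

```python
# Hypothetical "simulated air gap": the backup box brings its network
# up only for the duration of a backup run, then downs it again.

class AirGappedBackupBox:
    def __init__(self):
        self.network_up = False
        self.backups = []

    def _set_network(self, up):
        # On a real system this might shell out to something like
        # `ip link set eth0 up/down`; here we just track the state.
        self.network_up = up

    def receive(self, data):
        if not self.network_up:
            raise ConnectionError("network is down: box is air-gapped")
        self.backups.append(data)

    def run_backup_window(self, data):
        self._set_network(True)       # open the window
        try:
            self.receive(data)
        finally:
            self._set_network(False)  # close it even if the backup fails

box = AirGappedBackupBox()
box.run_backup_window("monday-full.tar")
print(box.network_up)  # prints: False  (unreachable between runs)

# Between windows, any attempt to reach the box fails:
try:
    box.receive("ransomware-payload")
except ConnectionError as e:
    print(e)  # prints: network is down: box is air-gapped
```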

[–] [email protected] 5 points 1 year ago

The backups are on a separate system with different credentials. One copy of the backups is sent to online storage that is immutable. You set a retention policy and then you can't delete, overwrite, or change the backups.
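The retention idea described above can be illustrated with a toy store that rejects deletes and overwrites until a per-object retention period has expired, similar in spirit to S3 Object Lock in compliance mode. This is a hypothetical sketch, not any vendor's actual API:

```python
# Toy immutable store: objects cannot be overwritten or deleted until
# their retention window has passed, even with valid credentials.

import datetime as dt

class ImmutableStore:
    def __init__(self, retention_days):
        self.retention = dt.timedelta(days=retention_days)
        self.objects = {}  # name -> (written_at, data)

    def put(self, name, data, now):
        if name in self.objects:
            written_at, _ = self.objects[name]
            if now - written_at < self.retention:
                raise PermissionError(f"{name} is locked until retention expires")
        self.objects[name] = (now, data)

    def delete(self, name, now):
        written_at, _ = self.objects[name]
        if now - written_at < self.retention:
            raise PermissionError(f"{name} is locked until retention expires")
        del self.objects[name]

store = ImmutableStore(retention_days=30)
t0 = dt.datetime(2023, 11, 1)
store.put("vm-backup.img", b"good data", now=t0)

# Ransomware (or a stolen admin credential) tries to destroy the backup
# five days later and is rejected:
try:
    store.put("vm-backup.img", b"ENCRYPTED", now=t0 + dt.timedelta(days=5))
except PermissionError as e:
    print(e)  # prints: vm-backup.img is locked until retention expires

# After the retention window it can be rotated out normally:
store.delete("vm-backup.img", now=t0 + dt.timedelta(days=31))
```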

[–] [email protected] 5 points 1 year ago

Look into the 3-2-1 strategy. Also: at least one backup should be taken offline after the backup is done. This might be done via tapes in a tape library, where you put your used tapes into a fireproof safe (one certified for tape fire protection; ask me if you don't know what that means). Backups that are not connected to a network are the most reliable in such a scenario. Most encrypters encrypt right away, so offline/archived backups are most likely not yet affected.

If the trojan kept itself silent for a couple of months (some specialised strains do that), even your archives are at risk. In that situation, often the only solution is to rebuild from scratch.
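For reference, 3-2-1 means: at least 3 copies of the data, on at least 2 different media types, with at least 1 copy offsite (and, per the comment above, ideally offline). A toy checker to make the rule concrete (the function and field names are made up for illustration):

```python
# 3-2-1 rule as a predicate: >=3 copies, >=2 media types, >=1 offsite.

def satisfies_3_2_1(copies):
    """copies: list of dicts like {"medium": "tape", "offsite": True}."""
    return (
        len(copies) >= 3
        and len({c["medium"] for c in copies}) >= 2
        and any(c["offsite"] for c in copies)
    )

plan = [
    {"medium": "nas",  "offsite": False},  # primary on-site copy
    {"medium": "tape", "offsite": False},  # second medium, in the safe
    {"medium": "s3",   "offsite": True},   # offsite, ideally immutable
]
print(satisfies_3_2_1(plan))       # prints: True

# Drop the offsite copy and the plan no longer qualifies:
print(satisfies_3_2_1(plan[:2]))   # prints: False
```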

[–] [email protected] 3 points 1 year ago

3-2-1 standard is what saves you.

[–] damium 3 points 1 year ago

If you want an automated system that can protect against ransomware, your backups need to be hosted somewhere the backup server, not the client, controls retention (a NAS share, local disk, etc. writable by the client is not sufficient). If your NAS supports automated snapshots that can't be deleted by the backup user, that can mostly fill the gap, but check how it handles snapshots when the disk fills.

For self-hosted solutions I've used BURP, Amanda, and Borg backup in the past, but have switched to Proxmox Backup Server as my VMs all run in Proxmox. You still need to consider full disaster-recovery scenarios where both your primary and backup systems fail; for this, PBS supports both tape and remote-server replication.

There are also many cloud solutions that do this automatically. For cloud I would always use them in tandem with some kind of local backup.

For all of these, the backup system should have an admin account with strong protection that doesn't share credentials with any of the primary systems.

[–] [email protected] 1 points 1 year ago

It's actually fairly simple. You just set up a backup server that connects to a network share and reads the data.