A Solid 3-2-1 Backup Strategy
Implementing the 3-2-1 backup rule to safeguard your data and ensure recovery.
Backups are often an afterthought until something goes wrong. When systems run smoothly, it’s easy to assume that data is safe – until a hard drive fails, ransomware strikes, or human error wipes out crucial files. By the time you realize the importance of backups, it’s often too late.
The 3-2-1 Backup Rule provides a simple, effective strategy to minimize risk and ensure that your data can be restored no matter the situation. Let’s break it down and look at how to implement it in a way that fits modern workflows.
3-2-1 Backup Rule Breakdown
Three Copies of Data
- Original Data – The files and databases you work with daily.
- Primary Backup – A full copy of your data stored locally (external drive, NAS, etc.).
- Secondary Backup (Off-site) – Cloud storage or another physical location.
Why Three?
If ransomware encrypts your main drive, or a fire destroys your house, having two additional copies significantly reduces the risk of total data loss.
Use automated backup software like Duplicacy, Veeam, or Backblaze to avoid manual backup headaches.
Two Different Media Types
- Local Drive/NAS – A fast, convenient on-site solution for quick restores.
- Cloud Backup – Services like Wasabi, AWS Glacier, or Google Drive provide scalable, off-site storage.
Why Different Media?
Hard drives can fail. USB sticks get lost. Diversifying media types protects against different types of failure.
Other Media Options:
- External SSDs – Fast, durable, and portable.
- Tape Drives – Still used for large-scale enterprise backups, though rare for home users because the drives carry a steep upfront cost.
- NAS Devices – Great for local network backups and file sharing.
One Off-site Backup
- Cloud Storage – Back up critical data to cloud providers.
- Physical Off-site – Store a backup drive at a trusted location or safe deposit box.
Why Off-site?
Local backups are useless if disaster strikes your entire home or office. Natural disasters, theft, or electrical fires can wipe out everything in one go. Off-site backups ensure that you can recover even in worst-case scenarios.
For sensitive data, encrypt your off-site backups using VeraCrypt or Cryptomator to add an extra layer of security.
RAID is Not a Backup
"RAID is not a backup" is a running meme in tech communities for good reason: many people assume that running RAID (Redundant Array of Independent Disks) protects their data. While RAID provides redundancy, it is not a substitute for a backup.
- RAID guards against drive failure – but if files are deleted or corrupted, those changes are mirrored across all disks.
- Backups allow you to restore previous versions of data – something RAID cannot do.
Use RAID alongside proper backups for the best of both worlds – redundancy and versioned restores.
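The difference is easy to demonstrate with throwaway directories: a mirror faithfully replicates a deletion, while a dated backup still holds the file. A toy sketch (all paths are temporary stand-ins, not real volumes):

```shell
#!/bin/sh
# Toy illustration only: a "mirror" replicates a deletion,
# while a dated (versioned) backup still has the file.
set -eu

live=$(mktemp -d)
mirror=$(mktemp -d)
versioned=$(mktemp -d)

echo "quarterly report" > "$live/report.txt"
cp -R "$live/." "$mirror/"                # mirror kept in sync, like RAID 1
mkdir "$versioned/2025-01-01"
cp -R "$live/." "$versioned/2025-01-01/"  # dated backup snapshot

rm "$live/report.txt"                     # accidental deletion...
rm -rf "$mirror"                          # ...which the mirror dutifully
cp -R "$live" "$mirror"                   # replicates on the next sync

test ! -f "$mirror/report.txt"            # gone from the mirror too
test -f "$versioned/2025-01-01/report.txt" && echo "restored from backup"
```

The mirror ends up exactly as broken as the live copy; only the dated snapshot can give the file back.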
Getting Started with Backups – A Practical Approach
- Local Backup
- Set up a NAS or external SSD with automated backup schedules.
- Use tools like rsync, Restic, or Synology Hyper Backup to mirror data regularly.
- Cloud Backup
- Choose a provider: Backblaze, iDrive, or Wasabi.
- Automate cloud backups for critical folders and databases.
- Versioning
- Enable versioning on your backup software. This lets you restore older snapshots of files, not just the latest version.
- Test Restores
- A backup is useless if it can’t be restored. Test restoring files at least once a quarter to ensure data integrity.
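A restore test doesn't have to be elaborate. Here is a minimal, tool-agnostic sketch of the idea – pull one file back out of the backup and prove it is byte-identical to the original (temp dirs stand in for real volumes):

```shell
#!/bin/sh
# Minimal restore drill: restore a single file from the backup copy
# and verify it matches the original byte for byte.
set -eu

src=$(mktemp -d)      # stand-in for live data
backup=$(mktemp -d)   # stand-in for the backup target
scratch=$(mktemp -d)  # scratch area for the drill

echo "important data" > "$src/notes.txt"
cp -R "$src/." "$backup/"            # the regular backup job
cp "$backup/notes.txt" "$scratch/"   # the test restore

cmp -s "$src/notes.txt" "$scratch/notes.txt" && echo "restore drill OK"
```

Swap the `cp` calls for your real backup tool (rsync, Restic, Hyper Backup) and point the drill at a file that actually matters to you.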
My Backup Setup – A Practical Example
To give you an idea of how the 3-2-1 backup rule works in practice, here’s how I handle backups across my infrastructure:
- Primary Data (Synology 1) – One Synology NAS stores all my personal data, acting as the main hub for daily use.
- Backup & Infrastructure (Synology 2) – A larger Synology backs up the first NAS and also holds infrastructure backups, including VMs, servers, and configurations.
- Off-site Backup (Synology 3) – This Synology sits off-site and backs up both of the other NAS devices using WireGuard VPN for secure remote access and rsync for synchronization.
- Offline SSD – To add an extra layer of redundancy, I regularly perform manual backups to an offline SSD. This ensures that even if ransomware or a severe failure occurs, I have a completely isolated copy of my most important data.
This setup gives me peace of mind, knowing that my data exists in multiple locations and across different types of media. Even if one NAS fails or my home experiences an unexpected disaster, the off-site backup ensures I can recover quickly.
If you're building your own backup solution, I highly recommend adopting a similar multi-layered approach. It doesn't need to be this elaborate from day one – even a basic NAS plus a cloud backup is a solid start – but the key is to ensure redundancy across different environments.
1. NAS-to-NAS Backups (Local and Remote)
Synology makes NAS-to-NAS backups straightforward with Hyper Backup or rsync. However, if you're like me and prefer more granular control over the process, rsync over WireGuard is the way to go for off-site backups.
Why WireGuard?
- Speed and Efficiency – WireGuard is faster than OpenVPN and has a smaller attack surface.
- Low Overhead – Minimal performance hit, perfect for low-power NAS devices.
- Simplicity – Easier to configure compared to traditional VPN solutions.
WireGuard + rsync Setup (Step-by-step)
On the Primary NAS (Synology 1):
- Install WireGuard via Docker (see my guide here)
On the Off-site NAS (Synology 3):
- Install WireGuard and import the peer config.
- Test the connection to ensure the NAS can route traffic through WireGuard.
- Add the following rsync job for incremental syncing:
rsync -avz --delete /volume1/data/backup/ user@<remote-nas>:/volume1/remote-backup/
- Automate the task with a cron job (this one runs daily at 2 AM):
crontab -e
0 2 * * * /usr/bin/rsync -avz --delete /volume1/data/backup/ user@<remote-nas>:/volume1/remote-backup/
- Test the sync manually to verify it works before relying on automation.
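For reference, the peer config imported on the off-site NAS follows the standard WireGuard format – something along these lines, where every key, address, and endpoint is a placeholder you'd replace with your own values:

```ini
# /etc/wireguard/wg0.conf on Synology 3 (all values are placeholders)
[Interface]
PrivateKey = <offsite-nas-private-key>
Address = 10.0.0.3/24

[Peer]
PublicKey = <primary-nas-public-key>
Endpoint = <your-home-ip-or-ddns>:51820
AllowedIPs = 10.0.0.0/24
PersistentKeepalive = 25
```

The `PersistentKeepalive` line matters for NAS-to-NAS setups: it keeps the tunnel alive through NAT so the nightly rsync job never finds it down.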
2. Snapshot Replication for VMs and Docker
For infrastructure backups, I use Synology Snapshot Replication to take incremental snapshots of VM images and Docker configurations. Snapshots are faster than full rsync jobs and offer versioning without duplicating entire datasets.
Setup for VM Backups:
- Enable Snapshot Replication from Synology DSM.
- Select the VM storage folder or Docker container directory.
- Schedule hourly snapshots with a 14-day retention policy.
- If the new schedule doesn't take effect, restart the service from the shell:
synoservice --restart pkgctl-SnapshotReplication
Retention Strategy:
- Hourly Snapshots (24 hours)
- Daily Snapshots (2 weeks)
- Weekly Snapshots (3 months)
This method ensures that if a VM becomes corrupted, I can roll back to the last known good state within seconds.
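DSM enforces that retention policy for its own snapshots. For backups you create by hand (database dumps, copies on the offline SSD), the same idea fits in a single `find` invocation – the directory path and `snap-*` naming below are assumptions for the sketch:

```shell
#!/bin/sh
# Prune hand-made, dated backup folders older than 14 days.
# SNAP_DIR and the 'snap-*' naming scheme are placeholders.
SNAP_DIR="${SNAP_DIR:-$(mktemp -d)}"   # point this at your snapshot folder

find "$SNAP_DIR" -maxdepth 1 -type d -name 'snap-*' -mtime +14 -exec rm -rf {} +
```

Run it from the same cron schedule as the backup job itself, so retention never depends on you remembering to clean up.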
3. Encrypting Backup Data with VeraCrypt
For sensitive off-site backups, I encrypt the drive using VeraCrypt before syncing. This adds a layer of security in case the NAS is physically compromised.
- Create an encrypted VeraCrypt volume:
veracrypt --create /volume1/backup/secure.img --size=500G --encryption=AES --filesystem=ext4
- Mount the volume during backup jobs:
veracrypt --mount /volume1/backup/secure.img /mnt/backup
- Rsync into the mounted volume, then dismount it (veracrypt --dismount /mnt/backup) once the job finishes so the data is never left exposed.
This ensures that even if someone accesses the off-site NAS, the data will be inaccessible without the encryption key.
4. Immutable Backups with Btrfs and S3
For critical configurations and personal data, I leverage Btrfs snapshots and sync them to S3-compatible cloud storage (Wasabi).
Why Btrfs?
- Copy-on-write – Only changed data is backed up, reducing overhead.
- Error detection – Btrfs checksums data and metadata, so corruption is caught on read; with a redundant copy available, it can repair the damage automatically.
- Native Snapshots – Fast, efficient, and space-saving.
Wasabi Configuration for Synology:
- Install the Hyper Backup package.
- Create a new backup task and select Wasabi (or S3).
- Choose "Enable versioning" to keep older versions of files.
Immutable Option:
- Enable Object Locking on the bucket to create immutable backups – even if ransomware compromises your NAS, the locked cloud copies cannot be altered or deleted until the retention period expires.
Monitoring Backup Health
No backup is complete without monitoring. I integrate n8n with my backup scripts to send notifications if a backup job fails.
Example:
rsync -avz /volume1/data/backup/ user@<remote-nas>:/volume1/remote-backup/ && \
curl -fsS -m 10 --retry 5 -o /dev/null https://n8n.mydomain.com/webhook
Because of the `&&`, the webhook is only pinged after a successful run; the n8n workflow raises an alert whenever an expected ping fails to arrive. So if the rsync job fails, I know within minutes.