Comment on How do you effectively backup your high (20+ TB) local NAS?
unit327@lemmy.zip 2 days ago
I use the AWS S3 Deep Archive storage class, at $0.001 per GB per month. But your upload bandwidth really matters in this case: I only have a subset of the most important things backed up this way, otherwise it would take months just to upload a single backup.
I have a complicated system where:
- borgmatic backups happen daily, locally
- those backups are stored on a btrfs subvolume
- a python script will make a read-only snapshot of that volume once a week
- the snapshot is synced to s3 using rclone with --checksum --no-update-modtime
- once the upload is complete the btrfs snapshot is deleted
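The weekly snapshot-and-upload step could be sketched roughly like this. The paths, subvolume names, and the `s3crypt` remote are placeholders, not the commenter's actual setup:

```shell
#!/usr/bin/env bash
# Sketch of the weekly step: snapshot, sync, delete.
# VOLUME/SNAP paths and the "s3crypt" remote name are hypothetical.
set -euo pipefail

VOLUME=/srv/backups        # btrfs subvolume holding the daily borgmatic backups
SNAP=/srv/backups-weekly   # read-only snapshot taken for a stable upload

# 1. take a read-only snapshot so rclone sees a consistent view of the repo
btrfs subvolume snapshot -r "$VOLUME" "$SNAP"

# 2. sync to S3: --checksum compares file hashes instead of size+modtime,
#    --no-update-modtime leaves remote modtimes alone on unchanged files
rclone sync "$SNAP" s3crypt:weekly --checksum --no-update-modtime

# 3. drop the snapshot once the upload completes
btrfs subvolume delete "$SNAP"
```

The read-only snapshot matters because borg repos must not change mid-upload; syncing a live repo can leave an inconsistent copy on the remote.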
I’ve also set up encryption in rclone so that all the data is encrypted and unreadable by AWS.
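Client-side encryption in rclone is done by layering a `crypt` remote over the S3 remote. A minimal sketch, assuming a bucket and remote names (`s3remote`, `s3crypt`, `my-backup-bucket`) that are placeholders:

```shell
# Base S3 remote, writing objects with the Deep Archive storage class
rclone config create s3remote s3 provider AWS storage_class DEEP_ARCHIVE

# crypt remote wrapping the bucket; rclone obscure encodes the passphrase
# for storage in the config file (it is not itself strong encryption)
rclone config create s3crypt crypt \
    remote s3remote:my-backup-bucket \
    password "$(rclone obscure 'your-passphrase')"
```

Anything synced to `s3crypt:` is then encrypted (contents and, by default, filenames) before it leaves the machine, so AWS only ever stores ciphertext.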
quick_snail@feddit.nl 2 days ago
Don’t do this. It’s a god damn nightmare to delete
unit327@lemmy.zip 1 day ago
How so? I can easily just delete the whole s3 bucket.
quick_snail@feddit.nl 1 day ago
Maybe I’m thinking of glacier. It took months trying to delete that.
CucumberFetish@lemmy.dbzer0.com 1 day ago
It is cheap as long as you don’t need to restore your data. Downloading data out of S3 costs a lot: OP asked about 56 TB of storage, for which data retrieval would cost about $4.7k.
aws.amazon.com/s3/pricing/ under data transfer
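Working backward from the tiered internet-egress rates on that page (assuming the usual $0.09 / $0.085 / $0.07 per GB tiers for the first 10 TB, the next 40 TB, and the remainder, with decimal TB), 56 TB lands right around the quoted figure:

```shell
# Hypothetical worked example of the ~$4.7k retrieval estimate:
# 10 TB @ $0.09/GB + 40 TB @ $0.085/GB + 6 TB @ $0.07/GB
awk 'BEGIN { printf "%.0f\n", 10000*0.09 + 40000*0.085 + 6000*0.07 }'
# prints 4720
```

And that is just the transfer out; pulling objects out of Deep Archive additionally incurs a per-GB retrieval charge before download can even start.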
unit327@lemmy.zip 1 day ago
I’m aware, but I myself have < 3 TB, and if I actually need it I’ll be more than happy to pay. It’s my “backup of last resort”; I keep other backups on site and, infrequently, on a portable HDD offsite.