Comment on How do you effectively backup your high (20+ TB) local NAS?
danielquinn@lemmy.ca 3 weeks ago
Honestly, I’d buy 6 external 20 TB drives, make 2 copies of your data (3 drives each), and then leave them somewhere safe but not at home. If you have friends or family able to store them, that’d do, but a safety deposit box is good too.
If you want to make frequent updates to your backups, you could plug the drives into a Raspberry Pi, put it on Tailscale, and just rsync the changes regularly. Of course, that means wherever you’re storing the backup needs room for such a setup.
I often wonder why there isn’t a sort of collective backup sharing thing going on amongst self hosters. A sort of “I’ll host your backups if you host mine” sort of thing. Better than paying a cloud provider at any rate.
Joelk111@lemmy.world 3 weeks ago
That NAS software company Linus (of Linus Tech Tips) funded has a feature like this planned, I think.
An open source standalone implementation would be dope as hell. Sure, it’d mean you’d need to double your NAS capacity (since you’d have to provide as much storage as you use), but that’s way easier than building a second NAS and storing/maintaining it somewhere else, or constantly paying for and managing a cloud backup.
WhyJiffie@sh.itjust.works 3 weeks ago
such a system would need a strict time limit on how long a restored copy can be held after a catastrophe. Otherwise leeching would be too easy.
Joelk111@lemmy.world 3 weeks ago
That’s an incredibly good point. Bad actors are the worst. Some ideas:
Definitely a difficult problem to solve.
WhyJiffie@sh.itjust.works 3 weeks ago
and also accounting for low-bandwidth connections… what’s more, some shitty providers even have monthly data caps
yeah, that would almost be a necessary feature: being able to hold on to the backup when you really can’t restore it yet.