Comment on Custom remote backup

Cyber@feddit.uk ⁨1⁩ ⁨week⁩ ago

It depends on the sync / backup software

Syncthing keeps a stored list of file hashes (which is why the initial scan takes so long), then monitors filesystem activity for changes to know what to sync.
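As a rough sketch of that idea (my own simplified version with whole-file SHA-256, not Syncthing's actual block-based index):

```python
import hashlib
import os

def scan(root):
    """Build a {relative path: sha256} index, like an initial scan (simplified)."""
    index = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            index[os.path.relpath(path, root)] = h.hexdigest()
    return index

def changed(old_index, new_index):
    """Files whose hash differs (or which are new) are the ones to sync."""
    return [p for p, digest in new_index.items() if old_index.get(p) != digest]
```

Once the index exists, later scans only need to re-hash files the filesystem reported as touched, which is why the first scan is the slow one.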

Rsync compares source and destination files with a quick check (size and modification time, by default); only files that fail that check go through its delta-transfer algorithm, which uses rolling checksums to send just the changed parts.
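The quick check is the part that makes a no-change run fast. Something along these lines (a sketch of the default behaviour, not rsync's actual code):

```python
import os

def needs_transfer(src, dst):
    """Sketch of rsync's default 'quick check': transfer if the destination
    is missing, or if size / modification time differ. The rolling-checksum
    delta transfer only runs on files that fail this check."""
    if not os.path.exists(dst):
        return True
    s, d = os.stat(src), os.stat(dst)
    return s.st_size != d.st_size or int(s.st_mtime) != int(d.st_mtime)
```

That's why rsync over a large, mostly-static photo archive spends almost all its time stat()ing files rather than reading them.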

Then, backup software does… whatever.

Back in the day on FAT filesystems they used the archive bit in each file's metadata, which was set by any write to the file and cleared during a backup. The next backup could then just back up the files with the bit still set.
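In miniature, the scheme works like this (a toy simulation with dicts standing in for the filesystem; the real bit lives in the FAT directory entry):

```python
def write(files, archive_bits, path, data):
    """Any write to a file sets its archive bit."""
    files[path] = data
    archive_bits[path] = True

def incremental_backup(files, archive_bits, backup):
    """Copy only files whose archive bit is set, then clear the bit."""
    for path in files:
        if archive_bits.get(path, True):   # bit set -> changed since last backup
            backup[path] = files[path]     # "copy" the file
            archive_bits[path] = False     # the backup clears the bit
    return backup
```

Cheap and simple: no hashing, no timestamps, just one flag per file that the filesystem and the backup tool hand back and forth.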

Your current strategy is OK - doing an offline backup after a bulk update; maybe it's just a case of making that more robust by automating it…?

I suspect you have quite a large archive, as photos don't compress well and +2TB won't disappear with dedupe… so it's mostly about long-term archival rather than highly dynamic data changes.

So that +2TB… do you drop those files in amongst everything else, or do you have 2 separate locations, i.e. "My Photos" + "To Be Organised"?

Maybe only back up "My Photos" once a year / quarter (for example), but fully sync "To Be Organised"… then you've reduced both the risk and the volume of backup data…?
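That split-cadence idea could be scripted with something as small as this (folder names and intervals are illustrative, not anything from the setup described above):

```python
import time

# Seconds between backups per folder; 0 means "back up on every run".
CADENCE = {"MyPhotos": 90 * 86400, "ToBeOrganised": 0}

def due_for_backup(folder, last_backup, now=None):
    """Return True if this folder's backup interval has elapsed.
    last_backup maps folder -> unix timestamp of its last backup."""
    now = now if now is not None else time.time()
    return now - last_backup.get(folder, 0) >= CADENCE[folder]
```

A wrapper script would then run the actual sync (rsync, Syncthing folder pause/resume, whatever) only for the folders that come back due, so the big static archive is rarely touched while the staging folder syncs every time.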
