Comment on Incremental backups to optical media: tar, dar, or something else?

sxan@midwest.social 17 hours ago

The densities I’m seeing on M-Discs - 100GB, $5 per, a couple of years ago - seemed acceptable to me. $50 for a TB? How big is your archive? Mine still fits on a 2TB disk.

Copying files directly would work, but my library is really big and that sounds tedious.

I mean, putting it in an archive isn’t going to make it any smaller. Compression rarely helps much on images that are already losslessly compressed.

And we’re talking about 100GB discs. Is squeezing the last 10MB out of a disc by splitting an archive across two discs really worth it?
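
(To sketch what I mean by copying files directly: something like this would group the originals into disc-sized sets instead of splitting one archive across discs. The 100GB capacity, the paths, and the greedy first-fit grouping are just assumptions for illustration, not tied to any particular tool.)

```python
from pathlib import Path

DISC_BYTES = 100 * 10**9  # nominal 100 GB BD-R M-Disc; leave headroom for filesystem overhead

def plan_discs(root):
    """Greedy first-fit-decreasing grouping of files into disc-sized sets."""
    files = [p for p in Path(root).rglob("*") if p.is_file()]
    files.sort(key=lambda p: p.stat().st_size, reverse=True)
    discs = []  # each entry: [remaining_bytes, list_of_paths]
    for f in files:
        size = f.stat().st_size
        for disc in discs:
            if disc[0] >= size:
                disc[0] -= size
                disc[1].append(f)
                break
        else:
            # no existing disc has room; start a new one
            discs.append([DISC_BYTES - size, [f]])
    return [paths for _, paths in discs]

if __name__ == "__main__":
    # "/photos/originals" is a placeholder path
    for i, paths in enumerate(plan_discs("/photos/originals"), start=1):
        total = sum(p.stat().st_size for p in paths)
        print(f"disc {i}: {len(paths)} files, {total / 1e9:.1f} GB")
```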

The metadata is a different matter. I’d have to think about how to handle the sidecar data… but you could almost keep that on a DVD-RW, because there’s no way it’s going to be anywhere near as large as the photos themselves. Is your photo editor DB bigger than 4GB?
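
(Checking that is a one-off size sum - add up the .xmp sidecars and the editor’s database. The paths below, and the assumption that the DB lives under ~/.config/darktable, are placeholders; adjust for whatever your editor actually uses.)

```python
from pathlib import Path

DVD_RW_BYTES = 4_700_000_000  # single-layer DVD capacity

# Placeholder locations: originals with darktable-style .xmp sidecars,
# plus the editor's database files.
sidecars = sum(p.stat().st_size for p in Path("/photos/originals").rglob("*.xmp"))
db = sum(p.stat().st_size
         for p in Path("~/.config/darktable").expanduser().rglob("*.db"))

total = sidecars + db
print(f"{total / 1e9:.2f} GB of sidecar data; fits on a DVD-RW: {total <= DVD_RW_BYTES}")
```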

I never change the originals. When I tag and edit, that information is kept separate from the source images, so I never have multiple versions of pictures - unless I export them for printing or something, and those are ephemeral and can be re-exported by the editor from the original and the sidecar. For music and photos alike, I always keep the originals isolated from the application.

This is good, though; it’s helping me clarify how I want to archive this stuff. Right now mine is just backed up on multiple disks and once to B2, but I’ve been thinking about how to archive for long-term storage.

I think I’m going to go the M-Disc route, with sidecar data on SSD and backed up to Blu-ray RW. The trick will be letting DarkTable know that the source images are on different media, but I’m pretty sure I saw an option for that. For sure, we’re not the first people to approach this problem.

The whole static binary thing - I’m going that route with an encrypted share for financial and account info, in case I die, but that’s another topic.
