Comment on Incremental backups to optical media: tar, dar, or something else?
traches@sh.itjust.works 5 days ago

Where I live (not the US) I’m seeing closer to $240 per TB for M-Disc. My whole archive is just a bit over 2 TB, though I’m also including exported JPGs in case I can’t get a working copy of darktable that can render my edits. It’s set to save XMP sidecars on edit, so I don’t bother backing up the database.
I mostly wanted a tool to divide the images into disc-sized chunks, and to automatically track changes to existing files, such as sidecar edits or new photos. I’m now seeing I can do both of those and still get files directly onto the disc, so that’s what I’ll be doing.
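The chunking part can be done with a greedy first-fit split by file size. A rough sketch, assuming GNU find/awk and a hypothetical `./photos` directory (the 100 GB figure is the nominal BD-R capacity being discussed, not anything the tools enforce):

```shell
# Assign files to numbered ~100 GB chunks, largest files first (first-fit decreasing).
# ./photos is a hypothetical path; requires GNU find and awk.
LIMIT=$((100 * 1000 * 1000 * 1000))   # nominal 100 GB disc capacity, in bytes
find ./photos -type f -printf '%s\t%p\n' | sort -rn | awk -F'\t' -v limit="$LIMIT" '
{
    size = $1 + 0; file = $2
    if (size > limit) { print "too-big\t" file; next }   # cannot fit on one disc
    i = 1
    while (used[i] + size > limit) i++                   # first chunk with room
    used[i] += size
    print i "\t" file
}'
```

Each output line is a chunk number, a tab, and a path; feeding one chunk’s file list to mkisofs/xorriso then yields one disc-sized image per chunk.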
I’d be careful with using SSDs for long-term, offline storage. I hear they can lose data if left unpowered for a long time. IMO metadata is small enough to just save a new copy when it changes.
This is more expensive in your country?
That’s a little over $11 USD per 100 GB disk. Is it just more expensive where you live, or is it shipping?
I’d be really surprised if these weren’t manufactured in Asia somewhere.
My options look like this:
allegro.pl/kategoria/nosniki-blu-ray-257291?m-dis…
Exchange rate is 3.76 PLN to 1 USD, which is actually the best I’ve seen in years.
Just out of curiosity, is the product on Amazon, and is it that same price?
Broadly similar from a quick glance: www.amazon.pl/s?k=m-disc+blu+ray
sxan@midwest.social 5 days ago
It’d be more space-efficient to store a QCOW2 image of Linux with a minimal desktop and basically only darktable on it. The VM format hasn’t changed in decades.
Shoot. A bootable disc containing Linux and the software you need to access the images; on a separate track, a QCOW2 image of the same; and on a third, just darktable. Best case, you pop in the disc and run darktable. Or you fire up a VM with the images. Worst case, boot into Linux. This may be the way I go, although - again - the source images are the important part.
What I meant was: keep the master sidecar on SSD for regular use, and back it up occasionally to a RW disc, probably with a simple cp -r to a dated directory. This works for me because my sources don’t change, except to add data, which is usually stored in date directories anyway.
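That dated-directory snapshot is a one-liner plus a mkdir. A sketch, where `SIDECAR_DIR` and `BACKUP_MOUNT` are hypothetical names for the sidecar tree and the mounted BD-RW:

```shell
# Dated snapshot of the sidecar tree onto the mounted RW disc.
# SIDECAR_DIR and BACKUP_MOUNT are hypothetical; set them for your layout.
SIDECAR_DIR="${SIDECAR_DIR:-$HOME/photos/sidecars}"
BACKUP_MOUNT="${BACKUP_MOUNT:-/tmp/bdrw}"
DEST="$BACKUP_MOUNT/sidecars-$(date +%F)"
mkdir -p "$DEST"
if [ -d "$SIDECAR_DIR" ]; then
    cp -r "$SIDECAR_DIR/." "$DEST/"
fi
```

Because the destination name embeds the date, repeated runs accumulate independent snapshots instead of overwriting the last one.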
You’re also wanting to archive the exported files, and sometimes those change? Surely this is much less data? If you’re like me, I’ll shoot 128 GB and end up using a tiny fraction of the shots. I’m not sure what I’d do for that - probably BD-RW. The longevity isn’t great, but it’s by definition mutable data, and in any case the most recent version can easily be regenerated as long as I have the sidecar and source image secured.
Burning the sidecar to disc is less about storage and more about backup, because that data is mutable. I suppose appending a backup snapshot to M-Disc periodically would be belt and suspenders, and frankly the sidecar data is so tiny I could probably append such snapshots to a single disc for years before it all gets used. Although… sidecar data would compress well. Probably simply tgz, then, since it’s always existed and always will, even if gzip has been superseded by better algorithms.
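Each appended snapshot could just be a dated tarball; writing it as a new session leaves the earlier ones readable. A sketch with a hypothetical `SIDECAR_DIR` (the growisofs line is dvd+rw-tools’ multi-session append, shown commented out since it needs a burner):

```shell
# One small, dated, compressed snapshot of the sidecar tree per backup run.
# SIDECAR_DIR is hypothetical; XMP sidecars are text and compress very well.
SIDECAR_DIR="${SIDECAR_DIR:-$HOME/photos/sidecars}"
SNAP="sidecars-$(date +%F).tar.gz"
if [ -d "$SIDECAR_DIR" ]; then
    tar czf "$SNAP" -C "$(dirname "$SIDECAR_DIR")" "$(basename "$SIDECAR_DIR")"
fi
# Append the snapshot as a new session on the same disc (dvd+rw-tools):
# growisofs -M /dev/sr0 -R -J "$SNAP"
```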
BTW, I just learned about the BLAKE3 (b3) hashing algorithm (about which I’m chagrined, because I thought I kept an eye on the topic of compression and hashing). It’s astonishingly fast - I’m suggesting it for the verification part.
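For verification, the b3sum CLI works like the familiar sha*sum tools: hash everything before burning, then re-run the check against the disc. A sketch, assuming b3sum is installed and `./archive` is a hypothetical staging directory:

```shell
# Checksum the staged files before burning; verify the burned copy later.
# Requires b3sum; ./archive is a hypothetical staging directory.
cd ./archive 2>/dev/null || exit 0    # nothing staged yet
find . -type f ! -name checksums.b3 -print0 | xargs -0 -r b3sum > checksums.b3
b3sum -c checksums.b3                 # run this again against the mounted disc
```

Burning checksums.b3 alongside the files means any future disc can be verified on its own.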