JPEG XL in lossless mode actually gives around 50% smaller file sizes than PNG
Comment on JPEG is Dying - And that's a bad thing | 2kliksphilip
ProdigalFrog@slrpnk.net 3 months ago
> Jpeg XL isn’t backwards compatible with existing JPEG renderers. If it was, it’d be a winner.
According to the video, and this article, JPEG XL is backwards compatible with JPEG.
But I’m not sure that’s all that necessary. JPEG XL was designed as a full, long-term replacement for JPEG. Old JPEG’s compression is very lossy, while JPEG XL outclasses it entirely at the same computational cost, speed, and file size. And PNG is lossless, so it’s not really comparable, since its files are so much larger.
JPEG XL, at least from what I’m seeing, does appear to be the best full replacement for JPEG (and it’s not like they can’t co-exist).
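For what it’s worth, the reference libjxl tools expose that compatibility path directly: cjxl can losslessly repack an existing JPEG’s compressed data into a .jxl file. A minimal sketch, assuming the cjxl binary from libjxl is installed and on PATH (filenames are made up):

```python
# Minimal sketch: losslessly repack a JPEG into JPEG XL with the
# libjxl reference encoder. Assumes `cjxl` is installed and on PATH.
import subprocess

def jpeg_to_jxl(src: str, dst: str) -> None:
    # --lossless_jpeg=1 (already the default for JPEG input) keeps the
    # original DCT coefficients, so the source JPEG stays exactly recoverable.
    subprocess.run(["cjxl", "--lossless_jpeg=1", src, dst], check=True)

jpeg_to_jxl("photo.jpg", "photo.jxl")  # hypothetical filenames
```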
AdrianTheFrog@lemmy.world 2 months ago
JPEG XL in lossless mode actually gives around 50% smaller file sizes than PNG
ProdigalFrog@slrpnk.net 2 months ago
Oh damn, even better than the estimates I found.
reddig33@lemmy.world 3 months ago
It’s only backwards compatible in the sense that it can losslessly re-encode existing JPEG content into the newer format. Existing browsers and apps can’t render JPEG XL without adding a new decoder.
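And that re-encode really is reversible: djxl can reconstruct a byte-identical copy of the original JPEG from the .jxl. A rough roundtrip check, assuming libjxl’s cjxl and djxl binaries are installed (filenames made up):

```python
# Sketch: verify the JPEG -> JXL -> JPEG roundtrip is byte-identical.
# Assumes `cjxl` and `djxl` from libjxl are on PATH.
import filecmp
import subprocess

subprocess.run(["cjxl", "original.jpg", "packed.jxl"], check=True)
subprocess.run(["djxl", "packed.jxl", "restored.jpg"], check=True)

# shallow=False compares file contents, not just stat() metadata
print(filecmp.cmp("original.jpg", "restored.jpg", shallow=False))  # expect True
```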
ProdigalFrog@slrpnk.net 3 months ago
Why is that a negative?
reddig33@lemmy.world 3 months ago
xkcd.com/927/
ProdigalFrog@slrpnk.net 3 months ago
The video actually references that comic at the end.
But I don’t see how that applies in your example, since both JPEG and JPEG XL existing in parallel doesn’t really have any downsides, it’d just be nice to have the newer option available. The thrust of the video is that Google is kneecapping JPEG XL in favor of their own format, which is not backwards compatible with JPEG in any capacity.
moonpiedumplings@programming.dev 2 months ago
[image]
seaQueue@lemmy.world 3 months ago
Legacy client support. Old devices running old browser code can’t support a new format without software updates, and that’s not always possible. Decoding JXL on a 15-year-old device that can’t be upgraded isn’t good UX. Sure, you can probably work around that with JavaScript decoding for many of them, but it’ll be slow and processor-intensive.
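The usual mitigation is to negotiate the format server-side so legacy clients never see a .jxl at all: clients that can decode JPEG XL advertise image/jxl in their Accept header, and everyone else gets the plain JPEG. A rough sketch (the WSGI app and filenames are hypothetical, and it assumes the client actually advertises image/jxl, which not every JXL-capable browser necessarily does):

```python
# Sketch: serve JPEG XL only to clients that say they accept it,
# falling back to plain JPEG for legacy browsers.
from wsgiref.simple_server import make_server

def app(environ, start_response):
    accept = environ.get("HTTP_ACCEPT", "")
    if "image/jxl" in accept:
        path, ctype = "photo.jxl", "image/jxl"   # hypothetical files
    else:
        path, ctype = "photo.jpg", "image/jpeg"
    with open(path, "rb") as f:
        body = f.read()
    # Vary: Accept tells caches the response depends on the Accept header
    start_response("200 OK", [("Content-Type", ctype), ("Vary", "Accept")])
    return [body]

make_server("", 8000, app).serve_forever()
```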
RamblingPanda@lemmynsfw.com 3 months ago
But how is that different from any other new format? WebP was no different.
ProdigalFrog@slrpnk.net 3 months ago
That’s a good argument, and as a fan of permacomputing and reducing e-waste, I must admit I’m fairly swayed by it.
However, are you sure JPEG XL decoding/encoding is so much more computationally heavy than JPEG that it would struggle on older hardware? This measurement seems to show it’s quite comparable to standard JPEG, unless I’m misunderstanding something.
That wouldn’t help people stuck on an outdated browser (older, unsupported phones?), but for machines that can change OS, like older PCs, a modern Linux distro with an updated browser should still decode JPEG XL fairly well, I would hope.
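If anyone wants to sanity-check decode cost on their own old hardware, a crude harness is easy enough. This sketch just shells out to djpeg (libjpeg) and djxl (libjxl), both assumed installed, and times repeated decodes of the same image; process-startup overhead is included, so treat it as a rough comparison only:

```python
# Crude sketch: compare JPEG vs JPEG XL decode time on this machine.
# Assumes `djpeg` (libjpeg) and `djxl` (libjxl) are on PATH, and that
# photo.jpg / photo.jxl (hypothetical files) contain the same image.
import subprocess
import time

def time_decode(cmd: list[str], runs: int = 20) -> float:
    start = time.perf_counter()
    for _ in range(runs):
        subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL)
    return (time.perf_counter() - start) / runs

print("jpeg:", time_decode(["djpeg", "photo.jpg"]))          # decodes to stdout
print("jxl: ", time_decode(["djxl", "photo.jxl", "out.ppm"]))
```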
Morphit@feddit.uk 3 months ago
xkcd.com/927/
Adding more decoders means more overhead in code size, project dependencies, maintenance, and developer bandwidth, plus a higher potential for security vulnerabilities.
SkyeStarfall@lemmy.blahaj.zone 3 months ago
The alternative is to never have anything better, which is not realistic
Yes, it means more code, but that’s an inevitability. We still have lots of legacy stuff, like, say, floppy disk drivers
moonpiedumplings@programming.dev 2 months ago
[image]
JackbyDev@programming.dev 3 months ago
They’re confusing backwards and forwards compatibility. The new file format is backwards compatible, but old renderers are not forward compatible with the new format.