Comment on "Algorithm based on LLMs doubles lossless data compression rates"

futatorius@lemm.ee 22 hours ago

Where I work, we've been looking into data compression that's optimized by an ML system. We have a shit-ton of parameters, and the ML algorithm compares the number of sig figs in each parameter to its byte size and truncates where that doesn't cause any loss of fidelity. So far it looks promising, with a really good compression factor, but we still need to do more work on de-skilling the decompression at the receiving end.
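
For the curious, here is a minimal sketch of the idea in Python. It is not the commenter's actual system: the function names, the NumPy float representation, and the fixed `sig_figs` input are all illustrative assumptions (the ML side would presumably estimate the real precision per parameter).

```python
# Precision-aware truncation sketch: round each parameter to its known
# number of significant figures, then downcast to a narrower float type
# only when the round trip is exact at that precision.
import numpy as np

def round_sig_figs(values: np.ndarray, sig_figs: int) -> np.ndarray:
    """Round every element to `sig_figs` significant figures."""
    vals = np.asarray(values, dtype=np.float64)
    with np.errstate(divide="ignore"):
        magnitudes = np.floor(np.log10(np.abs(vals)))
    # log10(0) is -inf; treat zeros as magnitude 0 so they pass through.
    magnitudes = np.where(np.isfinite(magnitudes), magnitudes, 0.0)
    factors = 10.0 ** (sig_figs - 1 - magnitudes)
    return np.round(vals * factors) / factors

def maybe_downcast(values: np.ndarray, sig_figs: int) -> np.ndarray:
    """Store as float32 only if no value changes at the stated precision,
    i.e. halving the byte size costs no fidelity."""
    narrowed = values.astype(np.float32).astype(np.float64)
    if np.array_equal(round_sig_figs(narrowed, sig_figs),
                      round_sig_figs(values, sig_figs)):
        return values.astype(np.float32)  # 8 -> 4 bytes per parameter
    return values  # keep full width; truncating would lose fidelity

params = np.array([3.14159265, 6.02214076e23, -0.0001234567])
print(maybe_downcast(params, sig_figs=5).dtype)  # float32 when safe
```

The point of the round-trip check is that truncation is only applied when the narrowed values are indistinguishable from the originals at the data's stated precision, which is what keeps the scheme lossless in practice.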

I wouldn’t have thought an LLM was the right technology to use for something like this.
