Comment on Huawei Zurich Lab’s New Open-Source Tech Lets LLMs Run on Consumer GPUs

wizzor@sopuli.xyz 13 hours ago

A little sparse on detail. I regularly run LLMs on 5-year-old CPUs, so no problem there; I wonder how this approach compares in memory requirements to existing quantization methods.
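For a rough sense of what quantization already buys you, here's a back-of-envelope sketch of weight memory at common bit widths (the 7B parameter count is just an assumption for illustration; this ignores KV cache, activations, and per-group scale/zero-point overhead):

```python
def weight_memory_gb(n_params: float, bits_per_weight: float) -> float:
    # Weights only: parameters x bits, converted to gigabytes.
    return n_params * bits_per_weight / 8 / 1e9

n = 7e9  # hypothetical 7B-parameter model
for label, bits in [("fp16", 16), ("int8 (Q8)", 8), ("int4 (Q4)", 4)]:
    print(f"{label:10s} ~{weight_memory_gb(n, bits):.1f} GB")
# fp16 ~14 GB, int8 ~7 GB, int4 ~3.5 GB
```

So a 4-bit quant of a 7B model already fits comfortably in consumer VRAM; the interesting question is whether the new approach beats that, or just matches it.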
