Comment on: Is it possible to run an LLM on a mini-PC like the GMKtec K8 and K9?

entropicdrift@lemmy.sdf.org, 4 months ago

A quantized model with more parameters is generally better than a full-precision model with fewer parameters. If you can squeeze a 14B-parameter model down to a 4-bit integer quantization, it'll still generally outperform a 16-bit floating-point 7B-parameter equivalent.

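A rough sketch of the memory math behind that claim (ignoring quantization overhead like scales/zero-points and the KV cache), just to show why the 4-bit 14B model fits in roughly the same footprint budget as the fp16 7B one:

```python
def weight_memory_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate storage for model weights in gigabytes."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# 14B model quantized to 4-bit ints vs. a 7B model in 16-bit floats.
print(f"14B @ 4-bit : ~{weight_memory_gb(14, 4):.0f} GB")   # ~7 GB
print(f" 7B @ 16-bit: ~{weight_memory_gb(7, 16):.0f} GB")   # ~14 GB
```

So on the same RAM budget you can usually fit the bigger quantized model, and the extra parameters tend to matter more for output quality than the lost precision.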