Comment on Is possible to run a LLM on a mini-pc like the GMKtec K8 and K9?
entropicdrift@lemmy.sdf.org 4 months ago
A quantized model with more parameters is generally better than a floating-point model with fewer parameters. If you can squeeze a 14B parameter model down to 4-bit integer quantization, it'll still generally outperform a 16-bit floating-point 7B parameter equivalent.
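To make the size comparison concrete, here's a rough back-of-the-envelope sketch (plain Python, weight storage only; real memory use is a bit higher once you add quantization scales/block overhead and the KV cache):

```python
# Rough estimate of model weight size in GiB (weights only, no runtime overhead).
def weight_size_gib(params_billion: float, bits_per_weight: float) -> float:
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1024**3

# 14B model at 4-bit integer quantization vs. 7B model at 16-bit floating point.
print(f"14B @ 4-bit : ~{weight_size_gib(14, 4):.1f} GiB")   # ~6.5 GiB
print(f" 7B @ 16-bit: ~{weight_size_gib(7, 16):.1f} GiB")   # ~13.0 GiB
```

So the quantized 14B model takes roughly half the memory of the FP16 7B model while usually scoring better on benchmarks.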
TheBigBrother@lemmy.world 4 months ago
Interesting information, mate. I'm reading up on the subject, thx for the help 👍👍