Comment on Researchers upend AI status quo by eliminating matrix multiplication in LLMs

WalnutLum@lemmy.ml 1 week ago

This is interesting, but I'll reserve judgement until I see comparable performance past 8 billion params.

All sub-4-billion-parameter models seem to have roughly the same performance regardless of quantization nowadays, so it's hard to see much potential in a 3-billion-parameter result.
