Comment on Llama 3.1 is Meta's latest salvo in the battle for AI dominance
brucethemoose@lemmy.world 5 months ago
llama.cpp, the underlying engine, doesn't support extended RoPE yet. Basically this means long context doesn't work and short context could be messed up too.
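For context, here's a rough sketch of what's at stake (head_dim and base values are assumptions matching Llama 3's published config, not pulled from this thread). Standard RoPE assigns each pair of head dimensions a fixed rotation frequency; Llama 3.1 additionally rescales those frequencies by wavelength to stretch the context window, and an engine that skips that step feeds the model positional encodings it was never trained on:

```python
import math

# Plain RoPE inverse frequencies (assumed head_dim=128, base=500000,
# matching Llama 3's config). Llama 3.1's "extended RoPE" rescales
# these per-frequency to reach long context; an engine without that
# step produces garbage at long context lengths.

def rope_inv_freq(head_dim=128, base=500000.0):
    return [base ** (-2 * i / head_dim) for i in range(head_dim // 2)]

freqs = rope_inv_freq()
# 64 frequencies, decaying from 1.0 for the fastest-rotating pair.
print(len(freqs), freqs[0], freqs[-1])
```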
I am also hearing rumblings of a messed up chat template?
Basically, with any LLM in any UI that uses a GGUF, you have to be very careful of bugs you wouldn't get in the Hugging Face-based backends.
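To illustrate the chat template issue: a template is just the string format that wraps each turn in the model's special tokens before inference. The sketch below hand-rolls a Llama-3-style template using Meta's documented token names (the function name is made up for illustration); if a GGUF ships with a wrong or mangled template, every prompt is formatted differently from training, and quality silently degrades:

```python
# Hypothetical sketch: how a Llama-3-style chat template turns a list
# of messages into the exact prompt string the model expects. Token
# names follow Meta's published Llama 3 prompt format.

def apply_llama3_template(messages):
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        parts.append(f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n")
        parts.append(msg["content"] + "<|eot_id|>")
    # Cue the model to generate the assistant's reply next.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = apply_llama3_template([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hi!"},
])
print(prompt)
```

A backend with a buggy template might, say, drop the `<|eot_id|>` terminators, and the model then rambles past the end of its turn.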
FaceDeer@fedia.io 5 months ago
I wouldn't call it a "dud" on that basis. Lots of models come out with lagging support on the various inference engines; it's a fast-moving field.
brucethemoose@lemmy.world 5 months ago
Yeah, but it leaves a bad initial impression when all the frontends ship it and the users aren't aware it's bugged.