Comment on Llama 3.1 is Meta's latest salvo in the battle for AI dominance

brucethemoose@lemmy.world ⁨2⁩ ⁨months⁩ ago

llama.cpp, the underlying engine, doesn’t support Llama 3.1’s extended RoPE scaling yet. Basically this means long context doesn’t work, and short context could be messed up too.
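For anyone curious what “extended RoPE” means here: Llama 3.1 rescales its rotary-embedding frequencies so positions beyond the original 8K training window stay usable. A rough sketch of that scaling, following the published config values (scale factor 8, low/high frequency factors 1 and 4, rope_theta 500000) — this is an illustration of the idea, not llama.cpp’s actual code:

```python
import math

def llama31_scale_freqs(freqs, factor=8.0, low_freq_factor=1.0,
                        high_freq_factor=4.0, old_context_len=8192):
    """Rescale RoPE frequencies the way Llama 3.1's config describes:
    leave high-frequency dims alone, divide low-frequency dims by the
    scale factor, and smoothly interpolate in between."""
    low_freq_wavelen = old_context_len / low_freq_factor
    high_freq_wavelen = old_context_len / high_freq_factor
    out = []
    for freq in freqs:
        wavelen = 2 * math.pi / freq
        if wavelen < high_freq_wavelen:
            out.append(freq)               # high-frequency dims: unchanged
        elif wavelen > low_freq_wavelen:
            out.append(freq / factor)      # low-frequency dims: fully scaled
        else:
            # smooth blend between the unscaled and scaled regimes
            smooth = (old_context_len / wavelen - low_freq_factor) / (
                high_freq_factor - low_freq_factor)
            out.append((1 - smooth) * freq / factor + smooth * freq)
    return out

base = 500000.0  # Llama 3.1's rope_theta
dim = 128        # per-head dimension
freqs = [base ** (-2 * i / dim) for i in range(dim // 2)]
scaled = llama31_scale_freqs(freqs)
```

A backend that skips this step (as llama.cpp did at the time) feeds the model the wrong position frequencies, which is why long context breaks.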

I am also hearing rumblings of a messed up chat template?

Basically, with any LLM in any UI that uses a GGUF, you have to watch out for bugs you wouldn’t hit in the huggingface-based backends.
