Comment on I've just created c/Ollama!

brucethemoose@lemmy.world · 2 days ago

TBH you should fold this into localllama? Or open source AI?

I have very mixed (mostly bad) feelings on ollama. In a nutshell, they’re kinda Twitter attention grabbers that give zero credit/contribution to the underlying framework (llama.cpp). It’s also a highly suboptimal way for most people to run LLMs, especially if you’re willing to tweak.

They’re… slimy. I would always recommend Kobold.cpp, tabbyAPI, ik_llama.cpp, Aphrodite, or any number of other backends over them. Anything but ollama.
