Comment on I've just created c/Ollama!

TheHobbyist@lemmy.zip ⁨2⁩ ⁨days⁩ ago

Indeed, Ollama is going a shady route. github.com/ggml-org/llama.cpp/pull/11016#issuecom…

I started playing with RamaLama (the name is a mouthful) and it works great. There are one or two extra steps in the setup, but I’ve achieved great performance, and the project makes good use of standards (OCI images, Jinja templates, unmodified llama.cpp, from what I understand).

Go check it out; it’s compatible with models from HF and Ollama too.

github.com/containers/ramalama
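For example, RamaLama addresses models through transport prefixes. A minimal sketch (the model names below are placeholders, not something I’ve benchmarked):

```shell
# Pull a model from the Ollama registry (placeholder model name)
ramalama pull ollama://tinyllama

# Pull a GGUF model from Hugging Face (placeholder repo name)
ramalama pull huggingface://TinyLlama/TinyLlama-1.1B-Chat-v1.0-GGUF

# Run it locally; RamaLama wraps inference in an OCI container
ramalama run ollama://tinyllama
```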
