Comment on "I've just created c/Ollama!"

brucethemoose@lemmy.world 1 day ago

I don’t understand.

Ollama is not actually Docker, right? It runs the same llama.cpp engine; it's just embedded inside the wrapper app rather than containerized.
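You can see this for yourself: with a local install, Ollama answers as a plain host process on its default port, no container runtime involved (a quick sanity check, nothing more):

```
# Start Ollama as an ordinary host process (no Docker)
ollama serve &

# Its HTTP API responds directly on the host's default port, 11434
curl http://localhost:11434/api/tags
```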

And basically every LLM project ships a Docker container anyway. I know for a fact that llama.cpp, TabbyAPI, Aphrodite, vLLM, and SGLang do.
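For example, llama.cpp publishes prebuilt server images on GitHub's container registry. Something like this should work, assuming the `server` tag is still current (check the repo's docs); the model path and filename here are placeholders:

```
# Run llama.cpp's official server image, mounting a local models directory
docker run -p 8080:8080 -v /path/to/models:/models \
  ghcr.io/ggml-org/llama.cpp:server \
  -m /models/your-model.gguf --host 0.0.0.0 --port 8080
```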

You are 100% right about security, though. In fact, there's a huge concern with compromised Python packages; this one almost got me: pytorch.org/blog/compromised-nightly-dependency/
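One partial mitigation for that class of attack (the PyTorch incident was a dependency-confusion hit on a nightly dependency) is pip's hash-checking mode, which refuses any package whose hash doesn't match the pinned one. A sketch, using pip-tools as one way to generate the pins:

```
# Generate a fully pinned requirements file with hashes (pip-tools)
pip install pip-tools
pip-compile --generate-hashes requirements.in

# Install in hash-checking mode: a swapped package fails loudly
pip install --require-hashes -r requirements.txt
```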
