Comment on: "For people self hosting LLMs.. I have a couple docker images I maintain"

fhein@lemmy.world 1 year ago

Awesome work! Going to try out koboldcpp right away. Currently running llama.cpp in Docker on my workstation, because it would be such a mess to get the CUDA toolkit installed natively…
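For anyone curious, a minimal sketch of that kind of setup using llama.cpp's prebuilt CUDA server image (the model filename, port, and GPU layer count are illustrative, and the NVIDIA Container Toolkit has to be installed on the host for `--gpus` to work):

```
# Minimal sketch: llama.cpp server with CUDA inside Docker.
# Assumes the NVIDIA Container Toolkit is set up on the host;
# model file, port, and -ngl value are illustrative.
docker run --gpus all -p 8080:8080 -v "$PWD/models:/models" \
  ghcr.io/ggerganov/llama.cpp:server-cuda \
  -m /models/llama-2-7b.Q4_K_M.gguf --host 0.0.0.0 --port 8080 -ngl 99
```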

Out of curiosity, isn’t conda a bit redundant inside Docker, since a container is already an isolated environment?
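To make the question concrete, this is the layering being asked about: a hypothetical Dockerfile that creates a conda environment inside a container that is itself already isolated (image and package names are illustrative):

```
# Hypothetical Dockerfile showing conda nested inside Docker.
FROM continuumio/miniconda3
# conda adds a second isolation layer inside an already-isolated container;
# installing into the image's Python directly would often work as well.
RUN conda create -y -n llm python=3.10 numpy
```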
