Comment on Can local LLMs be as useful and insightful as those widely available?

theotherbelow@lemmynsfw.com ⁨3⁩ ⁨weeks⁩ ago

100%. You don’t have to train a thing; Ollama runs openly available models. Many of them are decent, but the best ones need a lot of RAM/VRAM.
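For example, getting started with Ollama is just a pull and a run (a minimal sketch, assuming Ollama is installed and `llama3` is used as a stand-in for whichever open model you pick):

```shell
# Download an open-weights model from the Ollama library
ollama pull llama3

# Start an interactive chat session with it
ollama run llama3

# List the models you have downloaded locally
ollama list
```

Larger model variants (e.g. 70B parameters) are where the RAM/VRAM requirements really climb; smaller quantized variants run fine on modest hardware.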
