Comment on "Can local LLMs be as useful and insightful as those widely available?"

brucethemoose@lemmy.world · 1 week ago

Heh, you shouldn’t be paying for LLMs. Gemini 2.5 Pro is free, and so are a bunch of great API models.

I have technical reasons for running local models (instance-cached responses, constrained grammar, logprob output, fine-tuning), and I can help you set that up if you want. But TBH I'm not going into a long technical proof of why that's advantageous unless you really want to try all this yourself.
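For a taste, here's a minimal sketch of two of those features (constrained grammar and logprob output) using llama-cpp-python. The model path is hypothetical, so point it at whatever local GGUF file you have:

```python
# Minimal sketch: grammar-constrained output plus token logprobs, two things
# local runtimes expose directly. Requires: pip install llama-cpp-python
from llama_cpp import Llama, LlamaGrammar

llm = Llama(model_path="./model.gguf", n_ctx=2048)  # hypothetical path

# GBNF grammar that forces the model to emit exactly "yes" or "no"
grammar = LlamaGrammar.from_string('root ::= "yes" | "no"')

out = llm(
    "Is the sky blue? Answer yes or no: ",
    max_tokens=4,
    grammar=grammar,  # sampling can only produce tokens matching the grammar
    logprobs=5,       # also return the top-5 log probabilities per token
)

print(out["choices"][0]["text"])                      # "yes" or "no", nothing else
print(out["choices"][0]["logprobs"]["top_logprobs"])  # token-level confidence
```

The grammar guarantees the output parses every time (no retry loops), and the logprobs give you a cheap confidence signal, neither of which most hosted APIs expose this directly.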
