I’m somewhat tech savvy — how do I run an LLM locally? Any suggestions? And how do I know my local data is safe?

Retro_unlimited@lemmy.world 1 week ago
I have been using a program called GPT4All; you can download many models and run them locally. At startup it asks whether you want to share data or not. I select no and use it offline anyway.

Llak@lemmy.world 1 week ago
Check out LM Studio (lmstudio.ai), and you can pair it with the Continue extension for VS Code (docs.continue.dev/getting-started/overview).
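The reason LM Studio pairs well with editor tools like Continue is that it can serve a loaded model over an OpenAI-compatible HTTP API on localhost (port 1234 by default). Here is a minimal sketch using only the Python standard library — it assumes the local server is running with a model loaded, and the model name is a placeholder, not a real identifier:

```python
import json
import urllib.request

# Assumption: LM Studio's local server is listening on its default port.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at the local server."""
    payload = {
        "model": model,  # placeholder name; LM Studio routes to the loaded model
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    req = build_chat_request("Say hello in one sentence.")
    # The request only goes to localhost -- nothing leaves your machine.
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
```

Since everything targets localhost, you can double-check the data-safety question yourself: watch the app with a firewall or `lsof -i` and confirm it opens no outbound connections while you chat.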