Comment on the Cursor AI "unlimited" plan rug pull: Cursor AI silently changed its "unlimited" Pro plan to a severely rate-limited one without notice, locking users out after 3-7 requests
I’m still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.
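For context, a setup like that usually means a quantized GGUF build of the model loaded through something like llama.cpp. A rough sketch with the llama-cpp-python bindings (the model path is a placeholder, and this is just one common route, not necessarily the exact setup above):

```python
# Sketch: running a quantized coder model locally with llama-cpp-python.
# Assumes a GGUF model file has already been downloaded; the path below is
# a placeholder for whatever quantization you actually grabbed.
from llama_cpp import Llama

llm = Llama(
    model_path="./qwen2.5-coder-32b-instruct-q4_k_m.gguf",  # placeholder path
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to Metal/GPU where available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a function that checks if a string is a palindrome."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```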
And009@lemmynsfw.com 3 weeks ago
I’m somewhat tech savvy. How do I run an LLM locally? Any suggestions? And how do I know my local data is safe?
Llak@lemmy.world 3 weeks ago
Check out LM Studio (lmstudio.ai), and you can pair it with the Continue extension for VS Code (docs.continue.dev/getting-started/overview).
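LM Studio can also expose a local OpenAI-compatible server, so any OpenAI client can talk to it without prompts leaving your machine. A minimal sketch, assuming the `openai` Python package is installed, the local server is running on its default port, and a model is already loaded in the app (the model name below is a placeholder):

```python
# Sketch: talking to LM Studio's local OpenAI-compatible server from Python.
# Assumes LM Studio's local server is running on its default port (1234) and
# a model is loaded in the app; the model name below is a placeholder.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's local endpoint
    api_key="lm-studio",                  # any non-empty string works for a local server
)

response = client.chat.completions.create(
    model="qwen2.5-coder-7b-instruct",  # placeholder: use whatever model you loaded
    messages=[{"role": "user", "content": "Write a Python one-liner to reverse a string."}],
)
print(response.choices[0].message.content)
```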
Retro_unlimited@lemmy.world 3 weeks ago
I have been using a program called GPT4All, which lets you download many models and run them locally. It prompts you at startup to choose whether or not to share data. I select no and use it offline anyway.
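GPT4All also ships Python bindings if you would rather script it, and they run fully offline once a model file is on disk. A rough sketch, assuming the `gpt4all` package is installed and a GGUF model file has already been downloaded (the filename is a placeholder):

```python
# Sketch: local, offline text generation with the GPT4All Python bindings.
# Assumes the model file already exists locally; the filename below is a
# placeholder for whatever model you downloaded through the app.
from gpt4all import GPT4All

model = GPT4All(
    "Meta-Llama-3-8B-Instruct.Q4_0.gguf",  # placeholder model file
    allow_download=False,                  # stay offline: never fetch models over the network
)

with model.chat_session():
    reply = model.generate("Explain what a rug pull is in one sentence.", max_tokens=128)
    print(reply)
```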