RedGreenBlue@lemmy.zip 3 weeks ago
For the things AI is good at, like reading documentation, one should just get a local model and be done.
I think pouring in as much money as the big companies in the US have been doing is unwise. But when you have deep pockets, I guess you can afford to gamble.
SavageCoconut@lemmy.world 3 weeks ago
Could you point me to a model to do that and instructions on how to get it up and running?
FauxLiving@lemmy.world 3 weeks ago
I’m using DeepSeek R1 (8B) and Gemma 3 (12B), installed using LM Studio (which pulls models directly from Hugging Face).
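If you want to script against it rather than just chat in the app, LM Studio can also expose a local server that speaks the OpenAI chat API (by default on http://localhost:1234/v1). A minimal sketch, assuming that server is enabled and the model name matches one you’ve actually downloaded:

```python
# Minimal sketch: querying a model served by LM Studio's local server.
# Assumes the server is running on the default http://localhost:1234/v1
# and that the model identifier below matches a model you've downloaded.
from openai import OpenAI

# LM Studio ignores the API key, but the client requires a non-empty string.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="deepseek-r1-distill-llama-8b",  # placeholder id, use your own
    messages=[
        {"role": "user", "content": "Summarize the argparse docs for me."}
    ],
)
print(response.choices[0].message.content)
```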
Cethin@lemmy.zip 3 weeks ago
As the other comment says, LM Studio is probably the easiest tool. Once you’ve got it installed it’s trivial to add new models. Try some out and see what works best for you. Your hardware will limit what you can run, though, so keep that in mind.
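As a very rough rule of thumb for that hardware limit: a model needs about its parameter count times the bytes per weight, plus overhead for context. A back-of-the-envelope sketch (not exact; quantization formats and KV cache use vary):

```python
# Rough memory estimate for loading a local model (ignores KV cache
# growth with context length and other runtime overhead).
def approx_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight / 1e9

# An 8B model: ~16 GB in fp16, ~4 GB at a 4-bit quantization.
print(approx_memory_gb(8, 16))  # ~16.0
print(approx_memory_gb(8, 4))   # ~4.0
```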
null_dot@lemmy.dbzer0.com 3 weeks ago
I don’t have the hardware, so I’m using Open WebUI to run queries against models accessible via the Hugging Face API.
Works really well. I haven’t invested the time to understand how to use workspaces, which let you tune models, but apparently it’s doable.
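For reference, the same hosted models can also be queried from a script through the huggingface_hub client, which is roughly the kind of backend Open WebUI gets pointed at. A minimal sketch, assuming you have a Hugging Face access token and swap in whatever model id you actually use:

```python
# Minimal sketch of querying a hosted model through the Hugging Face
# Inference API. Requires: pip install huggingface_hub, plus an access token.
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="meta-llama/Llama-3.1-8B-Instruct",  # example model id
    token="hf_...",                            # your access token
)

reply = client.chat_completion(
    messages=[{"role": "user", "content": "Explain what a workspace does."}],
    max_tokens=200,
)
print(reply.choices[0].message.content)
```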