Comment on Windows 12 and the coming AI chip war
voluble@lemmy.world · 10 months ago

Not sure if you already know, but sophisticated large language models can be run locally on basic consumer-grade hardware right now. LM Studio, which I’ve been playing with a bit, is a host for insanely powerful, free, and unrestricted LLMs.
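
For anyone curious what “running locally” actually looks like: LM Studio can expose whatever model you’ve loaded through a local OpenAI-compatible server, so you can talk to it from a few lines of Python. A rough sketch only — port 1234 is LM Studio’s default, and the model name here is just a placeholder for whichever model you have loaded:

```python
# Minimal sketch: querying a model served locally by LM Studio's
# OpenAI-compatible server. Assumes the local server is running;
# port 1234 is the default, and "local-model" is a placeholder name.
import requests

response = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; LM Studio serves the currently loaded model
        "messages": [
            {"role": "user", "content": "Explain in one paragraph why local LLMs are useful."}
        ],
        "temperature": 0.7,
    },
    timeout=120,
)
print(response.json()["choices"][0]["message"]["content"])
```

Everything stays on your own machine — no API key, no rate limits, no data leaving your box.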