Comment on Windows 12 and the coming AI chip war

voluble@lemmy.world 10 months ago

Not sure if you already know, but sophisticated large language models can already be run locally on basic consumer-grade hardware. LM Studio, which I’ve been playing with a bit, is a desktop app for hosting insanely powerful, free, and unrestricted LLMs on your own machine.
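For anyone curious what "running locally" looks like in practice: LM Studio can expose an OpenAI-compatible HTTP server on your own machine (the default in recent builds is http://localhost:1234/v1 — check your version's Local Server settings), so any OpenAI-style client can talk to a local model with no cloud involved. A minimal sketch in Python, assuming the default port and a model already loaded:

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions format.
# Assumption: the server was started from LM Studio's "Local Server"
# tab on the default port 1234, with a model loaded.
BASE_URL = "http://localhost:1234/v1"

def build_request(prompt, temperature=0.7):
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def ask(prompt):
    """Send the prompt to the locally running model and return its reply."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Explain quantization in one sentence."))
```

Everything stays on localhost, which is the whole appeal: no API key, no rate limits, and nothing leaves your machine.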
