Comment on Uses for local AI?

kitnaht@lemmy.world 1 month ago

Once the model is trained, the electricity it uses for inference is trivial. LLMs can run on a local GPU. So you’re completely wrong.
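For context, running a small model on a local GPU really is only a few lines of Python. A minimal sketch, assuming the Hugging Face transformers and torch packages are installed; the model name is just an illustrative choice, swap in anything that fits your card's VRAM:

```python
import torch
from transformers import pipeline

# Use the first local GPU if one is available, otherwise fall back to CPU.
device = 0 if torch.cuda.is_available() else -1

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # illustrative small open-weight model
    device=device,
)

# Generate a short completion entirely on local hardware.
print(generator("Local inference only needs", max_new_tokens=30)[0]["generated_text"])
```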
