percent@infosec.pub 5 hours ago

I wouldn’t be surprised if that’s only a temporary problem - if it becomes one at all. People are quickly discovering ways to use LLMs more effectively, and open-source models are starting to become competitive with commercial ones. If we can keep finding ways to get more out of smaller open-source models, then maybe we’ll be able to run them on consumer- or prosumer-grade hardware.

GPUs and TPUs have also been improving their energy efficiency. There seems to be a big commercial focus on that too, as energy availability is quickly becoming a bottleneck.
