My guess would be that the desktop computer used to make the queries and read the results consumes more power than the LLM does, at least in the case of fast-responding models.
The expensive part is training a model, but usage is most likely not sold at a loss, so it can’t consume an unreasonable amount of energy per query.
Instead of this ridiculous energy argument, we should focus on the fact that AI (and other products that money is thrown at) isn’t actually that useful, yet companies control the narrative. AI is particularly successful here, with every CEO wanting in on it and people afraid it is so good it will end the world.
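As a back-of-envelope sanity check on the first point: whether the desktop or the model dominates comes down to simple arithmetic on watts and minutes. The sketch below compares an *assumed* desktop power draw and session length against the 18 Wh/query figure quoted later in this thread; both the 100 W and the 2 minutes are illustrative guesses, not measured values.

```python
# Back-of-envelope: desktop energy for one query session vs. an LLM's
# per-query energy. All inputs here are assumptions for illustration.

def session_energy_wh(power_w: float, minutes: float) -> float:
    """Energy (Wh) used by a device drawing power_w watts for `minutes` minutes."""
    return power_w * minutes / 60.0

desktop_w = 100.0   # assumed draw of a desktop PC plus monitor
session_min = 2.0   # assumed time to type a query and read the answer

desktop_wh = session_energy_wh(desktop_w, session_min)
query_wh = 18.0     # per-query estimate quoted downthread for GPT-5

print(f"desktop: {desktop_wh:.1f} Wh, one query: {query_wh:.1f} Wh")
```

With these particular numbers the desktop side comes out much smaller (about 3.3 Wh vs 18 Wh), but the comparison flips easily: at 18 Wh per query, a 100 W desktop only matches the model after roughly 11 minutes of use, so the conclusion depends entirely on the assumed wattage, session length, and per-query figure.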
ThePinkUnicorn@lemdro.id 7 months ago
For training, yes, but during operation, by this study’s measure, DeepSeek actually has a higher power draw, according to the article. Even models with more efficient programming use insane amounts of electricity.
A_norny_mousse@feddit.org 7 months ago
OK, I guess I didn’t read far enough, but your quote says that DeepSeek uses less than OpenAI?
ThePinkUnicorn@lemdro.id 7 months ago
Less than OpenAI’s o3, but that’s because o3 was estimated to use even more power than GPT-5’s 18 Wh per query.