Comment on "What's your thoughts on this?"
Honytawk@feddit.nl 3 days ago
You really bought into the AI phobia.
Even 1000 non-local AI prompts use about the same energy as your microwave.
Datacenters running AI are crunching billions of prompts a second.
LaLuzDelSol@lemmy.world 3 days ago
Hang on, those aren't even equivalent units. Equivalent to running your microwave for how long?
Honytawk@lemmy.zip 3 days ago
A single prompt is about 1 - 1.5 W/h.
A microwave is 1000 - 1500 W/h.
It doesn’t matter how long you run either. Most prompts only take seconds.
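(A rough back-of-envelope sketch of how those figures compare, assuming the per-prompt number is meant as watt-hours of energy and the microwave number as watts of power draw; every value below is taken from the comment above or assumed for illustration, not measured:)

```python
# Back-of-envelope comparison. Assumes ~1-1.5 Wh of energy per prompt and a
# microwave drawing 1,000-1,500 W of power. These are assumptions, not data.

PROMPT_ENERGY_WH = 1.5      # assumed energy per prompt, watt-hours (upper end)
MICROWAVE_POWER_W = 1200    # assumed microwave power draw, watts
MICROWAVE_RUN_MIN = 5       # assumed typical reheating run, minutes

# Energy for one microwave run: power (W) * time (h) = energy (Wh)
microwave_run_wh = MICROWAVE_POWER_W * (MICROWAVE_RUN_MIN / 60)  # 100 Wh

# How many prompts equal one microwave run?
prompts_per_run = microwave_run_wh / PROMPT_ENERGY_WH            # ~67 prompts

# Energy for 1,000 prompts vs. one hour of continuous microwave use
thousand_prompts_wh = 1000 * PROMPT_ENERGY_WH                    # 1,500 Wh
microwave_hour_wh = MICROWAVE_POWER_W * 1                        # 1,200 Wh

print(f"One {MICROWAVE_RUN_MIN}-minute microwave run: {microwave_run_wh:.0f} Wh")
print(f"Prompts with the same energy: {prompts_per_run:.0f}")
print(f"1,000 prompts: {thousand_prompts_wh:.0f} Wh vs. 1 h of microwave: {microwave_hour_wh} Wh")
```

Under those assumed figures, 1,000 prompts come out in the same ballpark as running a microwave for about an hour, which is roughly the comparison being made above.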
LaLuzDelSol@lemmy.world 2 days ago
Watts per hour is not a unit of energy or power; do you mean watt-hours? Neither of those numbers seems right if so. And the amount of energy consumed by a prompt will vary wildly based on the size of the model, your hardware, what your prompt is, etc. My point is that, with two identical prompts on two identical models, one run in a specialized datacenter and one run at home, the one at home will probably use more power because it's less efficient. Therefore, if we are concerned with how much power AI datacenters are using, switching from datacenters to home computing is clearly not a solution.
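(A minimal sketch of that efficiency argument; the per-prompt energy and the overhead multipliers are placeholder assumptions chosen only to show the shape of the comparison, not benchmarks:)

```python
# Illustrative only: compares energy per prompt for the same model run in a
# datacenter vs. at home, using assumed overhead factors for home hardware.

DATACENTER_WH_PER_PROMPT = 1.0   # assumed energy per prompt on specialized hardware

# Hypothetical overheads for running the identical model at home:
HOME_HARDWARE_FACTOR = 2.0       # assumed: consumer GPU vs. datacenter accelerator
HOME_UTILIZATION_FACTOR = 1.5    # assumed: no request batching, idle draw

home_wh_per_prompt = (DATACENTER_WH_PER_PROMPT
                      * HOME_HARDWARE_FACTOR
                      * HOME_UTILIZATION_FACTOR)

print(f"Datacenter: {DATACENTER_WH_PER_PROMPT:.1f} Wh/prompt (assumed)")
print(f"Home:       {home_wh_per_prompt:.1f} Wh/prompt (assumed overheads)")
# Under these assumptions, moving the same prompts from a datacenter to home
# hardware increases, rather than decreases, total energy use.
```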