Comment on As electric bills rise, evidence mounts that data centers share blame. States feel pressure to act

boonhet@sopuli.xyz 1 week ago

You’re right about the general idea, but I think you’re even underestimating the scale here.

I don’t think these servers will be doing much on CPU; the work will be on GPUs. HPE will sell you a 48-rack-unit behemoth with 36 Blackwell GB200 superchips, for a total of 72 GPUs and 36 CPUs. The CPUs are actually negligible here, but each of the 36 units draws a total of 2700 watts (a single GPU is supposedly 1200 watts, so 2 × 1200 = 2400 W for the GPUs, which would leave about 300 watts for the CPU).

36 × 2.7 kW = 97.2 kilowatts per rack. Put just a hundred of these in a data center and you’re talking over 10 megawatts once cooling and everything else is factored in. So that’s what, 100k m² of solar panels for 100 racks?
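The power math above as a quick Python sketch. The per-unit and per-GPU wattages are the spec numbers quoted earlier; the 10% cooling overhead factor is my own assumption, just to show how the total clears 10 MW:

```python
# Back-of-the-envelope rack power, using the spec numbers from the post
GPUS_PER_UNIT = 2        # each GB200 pairs 2 Blackwell GPUs with 1 Grace CPU
GPU_WATTS = 1200         # supposed per-GPU draw
UNIT_WATTS = 2700        # quoted total draw per GB200 unit
CPU_WATTS = UNIT_WATTS - GPUS_PER_UNIT * GPU_WATTS  # leaves ~300 W for the CPU

UNITS_PER_RACK = 36
rack_kw = UNITS_PER_RACK * UNIT_WATTS / 1000        # 97.2 kW per rack

RACKS = 100
COOLING_OVERHEAD = 1.1   # assumed ~10% extra for cooling etc. (my guess, not a spec)
site_mw = RACKS * rack_kw * COOLING_OVERHEAD / 1000
print(f"{rack_kw} kW per rack, ~{site_mw:.1f} MW for {RACKS} racks")
```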

You’d want them running most of the time too; idle hardware is just a depreciating asset. Say they run 75% of the time: 0.75 × 10 MW × 24 × 365 = 65,700 MWh per year, which I won’t even bother converting to gigawatt hours. The average American household uses about 11 MWh of electricity per year, so a single AI-focused data center, without even all that many racks, uses as much power as ~6,000 households. They’re building them all over the country, and in reality I think they’re way bigger than what I described.
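And the annual-energy comparison, same assumptions as above (75% utilization and the ~11 MWh/year household figure are the ones from the post):

```python
# Annual energy for the ~10 MW site at 75% utilization, vs household usage
SITE_MW = 10
UTILIZATION = 0.75
annual_mwh = UTILIZATION * SITE_MW * 24 * 365       # MWh per year

HOUSEHOLD_MWH_PER_YEAR = 11   # rough US average from the post
households = annual_mwh / HOUSEHOLD_MWH_PER_YEAR    # how many homes that powers
print(f"{annual_mwh:,.0f} MWh/year, roughly {households:,.0f} households")
```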

source