Comment on Wyoming to host massive AI data center using more electricity than all Wyoming homes combined

MysteriousSophon21@lemmy.world 3 days ago
Yeah that language is pure corporate BS - data centers CONSUME energy at massive scales (up to 1 gigawatt in this case, which is insane), they’re literally just giant heaters that occasionally produce AI outputs as a byproduct of all that wasted electricity.

Tja@programming.dev 2 days ago
1 gigawatt is not that insane, and I doubt it’s even what the datacenter consumes. A rack can easily get into double or even triple digits of kW for GPU-heavy setups. So let’s say 10 racks per megawatt. I’m sure such a datacenter has more than 10,000 racks. Plus A/C, and all other “ancillary” uses. A big datacenter can get close to 1 GW; this thing might be double digits, but I doubt they will publish exact numbers.
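To make the rack arithmetic explicit, here is a quick back-of-the-envelope check. The rack power, rack count, and PUE figure are assumptions for illustration (100 kW/rack matches "10 racks per megawatt"), not published numbers for this site:

```python
# Back-of-the-envelope check of the rack math above.
# All inputs are assumptions from the comment, not published figures.
kw_per_rack = 100            # a GPU-heavy rack can reach triple-digit kW
racks = 10_000               # assumed rack count for a large campus
it_load_mw = kw_per_rack * racks / 1000
print(it_load_mw)            # 1000.0 MW of IT load alone, i.e. 1 GW

# Cooling and other "ancillary" loads scale this up by the PUE
# (power usage effectiveness); ~1.2-1.5 is a common range.
pue = 1.3                    # assumed
total_mw = it_load_mw * pue
print(total_mw)              # 1300.0 MW
```

So even before cooling, the assumed rack count alone lands at 1 GW, which is the point of the comment.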
bigfondue@lemmy.world 3 days ago
I guess they could say they are generating 1GW of computing power
ITGuyLevi@programming.dev 3 days ago
More like a GW of heat… Thankfully I’m sure that will counteract whatever has caused it to be over 80 degrees on my way to work before 0700.
Tja@programming.dev 2 days ago
Every square kilometer of land (0.38 Sq miles in freedom units) gets about a GW of heat from the sun (depending on latitude). I doubt one datacenter will contribute that much…
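The "a GW per square kilometer from the sun" figure checks out as a peak value. A sketch, assuming a rough clear-sky surface irradiance of about 1000 W/m² at solar noon (it varies with latitude, season, and weather):

```python
# Sanity check of the solar-heat comparison above.
# 1000 W/m^2 is an assumed rough clear-sky peak at the surface.
peak_irradiance_w_per_m2 = 1000
area_m2 = 1_000_000            # 1 km^2 (about 0.386 sq mi)
solar_gw = peak_irradiance_w_per_m2 * area_m2 / 1e9
print(solar_gw)                # 1.0 GW at solar noon
```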
ITGuyLevi@programming.dev 2 days ago
I don’t know, I’ve been in some hot places, but massive cooling towers tend to radiate a bit more (now I know what I’m reading about today), and a data center without the ability to pump heat outside isn’t going to make it a whole day before it’s toast.
Not necessarily disagreeing, just curious about how much heat is dispersed by the ones here.