Comment on AI Training Slop

jfrnz@lemm.ee, 5 days ago

Running a 500 W GPU 24/7 for a full year consumes less than a quarter of the energy used by the average automobile in the US (as of 2000). I don't know how many GPUs this person has or how long the fine-tuning took, but it's clearly not creating an ecological disaster. Please understand there is a huge difference between the power consumed by companies training cutting-edge models at massive scale and speed, and a locally deployed model doing only fine-tuning and inference.
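A minimal back-of-the-envelope sketch of that comparison, assuming roughly 11,500 miles driven per year at about 25 mpg and ~33.7 kWh of energy per gallon of gasoline (those figures are my illustrative assumptions, not values from the comment):

```python
# Compare a 500 W GPU running 24/7 for a year against the annual
# fuel energy of an average US automobile.
# Mileage, fuel economy, and energy density are assumed figures.

GPU_POWER_W = 500
HOURS_PER_YEAR = 24 * 365

gpu_kwh = GPU_POWER_W * HOURS_PER_YEAR / 1000  # = 4,380 kWh/year

MILES_PER_YEAR = 11_500   # assumed average annual mileage
MPG = 25                  # assumed average fuel economy
KWH_PER_GALLON = 33.7     # approximate energy content of gasoline

car_kwh = MILES_PER_YEAR / MPG * KWH_PER_GALLON  # ~15,500 kWh/year

print(f"GPU: {gpu_kwh:,.0f} kWh/year")
print(f"Car: {car_kwh:,.0f} kWh/year")
print(f"GPU as fraction of car: {gpu_kwh / car_kwh:.0%}")
```

Under these assumptions the GPU lands somewhere around a quarter to a third of the car's annual energy; the exact ratio shifts with the mileage and fuel-economy figures you plug in.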
