jfrnz
@jfrnz@lemm.ee
- Comment on AI Training Slop 5 days ago:
The model already exists — abstaining from using it doesn’t make the energy spent training it go away. I don’t think it’s reasonable to let sunk energy costs dictate what you do; otherwise you would never touch a computer.
- Comment on AI Training Slop 5 days ago:
The point is that OP (most probably) didn’t train it — they downloaded a pre-trained model and only did fine-tuning and inference.
- Comment on AI Training Slop 5 days ago:
Running a 500W GPU 24/7 for a full year consumes less than a quarter of the energy used by the average automobile in the US (in 2000). I don’t know how many GPUs this person has or how long the fine-tuning took, but it’s clearly not creating an ecological disaster. Please understand there is a huge difference between the power consumed by companies training cutting-edge models at massive scale and speed, and a locally deployed model doing only fine-tuning and inference.
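The back-of-envelope comparison above can be sketched in a few lines of Python. The car fuel consumption and per-gallon energy content below are assumed illustrative figures, not sourced data:

```python
# Back-of-envelope check: a 500 W GPU running 24/7 for a year
# vs. the energy content of fuel burned by an average US car.
# The car figures are rough assumptions for illustration only.

GPU_POWER_KW = 0.5         # the 500 W GPU from the comment
HOURS_PER_YEAR = 24 * 365  # 8760 hours

gpu_kwh = GPU_POWER_KW * HOURS_PER_YEAR  # 4380 kWh

# Assumed average US car: roughly 550 gallons of gasoline per year,
# at roughly 33.7 kWh of heat energy per gallon.
CAR_GALLONS_PER_YEAR = 550
KWH_PER_GALLON = 33.7

car_kwh = CAR_GALLONS_PER_YEAR * KWH_PER_GALLON  # ~18,500 kWh

print(f"GPU: {gpu_kwh:,.0f} kWh/year")
print(f"Car: {car_kwh:,.0f} kWh/year")
print(f"Ratio: {gpu_kwh / car_kwh:.2f}")
```

Under these assumptions the GPU comes out around 24% of the car's annual energy, consistent with the "less than a quarter" claim; the exact ratio shifts with the fuel figures you plug in.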