Comment on AI Computing on Pace to Consume More Energy Than India, Arm Says
crispyflagstones@sh.itjust.works 8 months ago
The ENIAC drew 174 kilowatts and weighed 30 tons. It drew those 174 kilowatts to achieve a few hundred to a few thousand operations per second, while an iPhone 4 can handle about 2 billion operations a second and draws maybe 1.5 W under heavy load.
Like, yeah, obviously, the tech is inefficient right now; it’s just getting off the ground. Think about it: do you really think OpenAI likes the idea of paying more money to deliver the same amount of AI services, when in theory it could be paying less to deliver the same services? It’s just silly to assume there will be no efficiency improvements in AI, as if the same industry that took us from ENIAC to where we are today is just going to sit on its ass and twiddle its thumbs because it doesn’t like doing research or making money…?
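The comparison above can be made concrete as a back-of-the-envelope sketch. All figures are rough, order-of-magnitude estimates; the 5,000 ops/s figure for ENIAC is an assumption picked from the “few hundred to a few thousand” range cited:

```python
# Rough efficiency comparison using the figures from the comment above.
# ENIAC: ~5,000 operations/s at 174 kW (assumed value in the cited range).
# iPhone 4: ~2e9 operations/s at ~1.5 W under load.

eniac_ops_per_sec = 5_000
eniac_watts = 174_000

iphone_ops_per_sec = 2_000_000_000
iphone_watts = 1.5

eniac_ops_per_watt = eniac_ops_per_sec / eniac_watts      # ~0.03 ops/W
iphone_ops_per_watt = iphone_ops_per_sec / iphone_watts   # ~1.3e9 ops/W

improvement = iphone_ops_per_watt / eniac_ops_per_watt
print(f"ENIAC:  {eniac_ops_per_watt:.3f} ops/W")
print(f"iPhone: {iphone_ops_per_watt:.2e} ops/W")
print(f"Improvement: ~{improvement:.1e}x")
```

Even with generous rounding, that is roughly a ten-orders-of-magnitude gain in operations per watt.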
The ENIAC drew 174 kilowatts and weighed 30 tons.
it’s just getting off the ground
That’s what we’re afraid of, yes.
Yeah, uh huh, efficiency isn’t really a measure of absolute power use; it’s a measure of how much you get done with the power. Let me answer this very quickly: Google, Amazon, Microsoft, and Meta all together could not get anything done as companies if they all had to split an ENIAC (vastly less powerful than an older-model iPhone) between them. This is a completely meaningless comparison.
Absolute power consumption does matter, but global energy consumption is approximately 160,000 TWh per year, so even after doubling, all the largest cloud providers together are using less than 0.05% of all the energy used across the world. And a chunk of that extra 36 TWh is going to their everyday operations, not just their AI stuff.
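The percentage can be checked in two lines, using the figures cited above (~160,000 TWh global annual energy use; cloud use doubling from ~36 TWh to ~72 TWh):

```python
# Sanity check on the share of global energy claimed above.
global_twh = 160_000  # approximate global annual energy consumption
cloud_twh = 72        # ~36 TWh doubled

share_pct = cloud_twh / global_twh * 100
print(f"Cloud share of global energy: {share_pct:.3f}%")  # ~0.045%
```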
The more context I add to the picture, the less I’m worried about AI in particular. The overall growth model of our society is the problem, and that is going to need political/economic solutions. Fixating on a new technology as the culprit is literally just Luddism all over again, and will have exactly as much impact in the long run.
Google, Amazon, Microsoft, and Meta all together could not get anything done as companies
Google’s biggest revenue stream is advertisement
Amazon’s biggest revenue stream is data hosting for national militaries and police forces.
Microsoft’s biggest revenue stream is subscriptions to software that was functionally complete 20 years ago
Meta’s biggest revenue stream is ads again
So 72 TWh of energy spent on Ads, Surveillance, Subscriptions, and Ads.
Absolute power consumption does matter, but global power consumption is approximately 160,000 TWh
If these firms were operating steel foundries or airlines at 72 TWh, I would applaud them for their efficiency. Shame they’re not producing anything of material value.
The more context I add in to the picture, the less I’m worried about AI in particular.
It’s not for you to worry about. The decision to rapidly consume cheap energy and potable water is entirely beyond your control. Might as well find a silver lining in the next hurricane.
So 72 TWh of energy spent on Ads, Surveillance, Subscriptions, and Ads.
Capitalism truly does end up with the most efficient distribution of resources
I don’t like these companies for their cooperative/friendly attitude towards nation-states either, but your comments are insipid. AWS has something like 2 million businesses as customers. They have 30% market share in the cloud space; of course they provide cloud services to cops and militaries. They’re cheap, and one of the biggest providers, period. I can’t find any numbers showing their state contracts outweigh their business contracts.
And, sure, plenty of those business contracts are for businesses that don’t do anything useful, but what you don’t seem to understand is that telecoms is vital to industry and literally always has been. It’s not like there’s a bunch of virtuous factories over here producing tons of steel and airplanes, and a bunch of computers stealing money over there. Those factories and airlines you laud are owned by businesses, who use computers to organize and streamline their operations. Computers are a key part of why any industry is as productive as it is today.
AI, and I don’t so much mean LLMs and Stable Diffusion here, even if they are fun and eye-catching, will also contribute to streamlining the operations of those virtuous steel foundries and airlines you approve of so heartily. They’re not counterposed to each other. Researchers are already making use of ML in the sciences to speed up research. That research will be applied in real-world industry. It’s all connected.
It’s not for you to worry about. The decision to rapidly consume cheap energy and potable water is entirely beyond your control. Might as well find a silver lining in the next hurricane.

By the same token, you shouldn’t worry about it either, and should log off and cower under your bed.
AlotOfReading@lemmy.world 8 months ago
ML is not an ENIAC situation. Computers got more efficient not by doing fewer operations, but by making what they were already doing much more efficient.
The basic operations underlying ML (e.g. matrix multiplication) are already some of the most heavily optimized things around. ML is inefficient because it needs to do a lot of that. The problem is very different.
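The point that ML’s cost comes from the sheer volume of already-optimized operations can be illustrated with the standard operation count for a dense matrix multiply, roughly 2·m·n·k floating-point operations for an (m, k) × (k, n) product. The 4096 dimensions below are an illustrative assumption, not a figure from the thread:

```python
# Why ML is power-hungry even with heavily optimized kernels:
# the operation count itself is enormous. A dense (m, k) x (k, n)
# matrix multiply takes roughly 2*m*n*k floating-point operations.
def matmul_flops(m: int, k: int, n: int) -> int:
    return 2 * m * n * k

# A single hypothetical 4096x4096 multiply, a size typical of
# large neural-network layers:
flops = matmul_flops(4096, 4096, 4096)
print(f"{flops:.2e} FLOPs per multiply")  # ~1.37e11
```

A single training step chains thousands of such multiplies, so even a perfectly optimized kernel adds up to a very large energy bill.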
crispyflagstones@sh.itjust.works 8 months ago
There’s an entire resurgence of research into alternative computing architectures right now, being led by some of the biggest names in computing, because of the limits we’ve hit with the von Neumann architecture as regards ML. I don’t see any reason to assume all of that research is guaranteed to fail.
AlotOfReading@lemmy.world 8 months ago
I’m not assuming it’s going to fail, I’m just saying that the exponential gains seen in early computing are going to be much harder to come by because we’re not starting from the same grossly inefficient place.
As an FYI, most modern computers are modified Harvard architectures, not von Neumann machines. There are other architectures being explored that are even more exotic, but I’m not aware of any that are massively better on the power side (vs simply being faster). The acceleration approaches I’m aware of that are more power-efficient (e.g. analog or optical accelerators) are also totally compatible with traditional Harvard/von Neumann architectures.
crispyflagstones@sh.itjust.works 7 months ago
And by comparing it to ENIAC I didn’t mean to suggest the gains would be identical, but we are currently in a period of exponential gains in AI and it’s not exactly slowing down. It just seems unthoughtful and uncritical to measure the overall efficiency of a technology by its very earliest iterations, when the field it’s based on is moving as fast as AI is.