New memory tech unveiled that reduces AI processing energy requirements by 1,000 times or more
Submitted 3 months ago by obbeel@lemmy.eco.br to technology@lemmy.world
Comments
FaceDeer@fedia.io 3 months ago
It probably doesn't matter from a popular perception standpoint. The talking point that AI burns massive amounts of coal for each deepfake generated is now deeply ingrained; it'll be brought up regularly for years after it's no longer true.
GBU_28@lemm.ee 3 months ago
Casual consumers don’t care one bit about this. Companies would, because this would save money.
Lettuceeatlettuce@lemmy.ml 3 months ago
Capitalists: So you’re telling me I can build 1000x more AI data center infrastructure now?
recklessengagement@lemmy.world 3 months ago
Possible solution to the Von Neumann bottleneck? Or does this address a different issue
HubertManne@moist.catsweat.com 3 months ago
I hope this is true. AI has its uses, but it can't stay this much more inefficient than everything else. It would be great if answering a query used no more energy than a standard web query.
A_A@lemmy.world 3 months ago
“In this work, a CRAM array based on magnetic tunnel junctions (MTJs) is experimentally demonstrated. First, basic memory operations, as well as 2-, 3-, and 5-input logic operations, are studied. Then, a 1-bit full adder with two different designs is demonstrated.”
www.nature.com/articles/s44335-024-00003-3
So this has been experimentally demonstrated, but only at small scale.
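For readers unfamiliar with what "a 1-bit full adder" means in the quoted abstract: it's the basic building block of binary arithmetic, and MTJ-based in-memory logic is commonly built from majority gates. Below is a minimal software sketch of that logic in plain Python — it models only the Boolean behavior, not the CRAM device physics or the paper's actual circuit designs; the function names are illustrative, not from the paper.

```python
def majority(a: int, b: int, c: int) -> int:
    """3-input majority gate, a common primitive in MTJ-based logic."""
    return 1 if (a + b + c) >= 2 else 0

def full_adder(a: int, b: int, cin: int) -> tuple:
    """Return (sum_bit, carry_out) for one bit position."""
    s = a ^ b ^ cin              # sum bit: XOR of all three inputs
    cout = majority(a, b, cin)   # carry-out: majority of the three inputs
    return s, cout

# Exhaustive check over all 8 input combinations:
# the adder must satisfy a + b + cin == s + 2*cout.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert a + b + cin == s + 2 * cout
```

The energy argument in the article is that CRAM evaluates gates like these inside the memory array itself, avoiding the data shuttling between memory and processor that dominates the cost in a conventional design.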