New memory tech unveiled that reduces AI processing energy requirements by 1,000 times or more
Submitted 1 month ago by obbeel@lemmy.eco.br to technology@lemmy.world
Comments
FaceDeer@fedia.io 1 month ago
It probably doesn't matter from a popular perception standpoint. The talking point that AI burns massive amounts of coal for each deepfake generated is now deeply ingrained; it'll be brought up regularly for years after it's no longer true.
GBU_28@lemm.ee 1 month ago
Casual consumers don’t care one bit about this. Companies would, because it would save them money.
Lettuceeatlettuce@lemmy.ml 1 month ago
Capitalists: So you’re telling me I can build 1000x more AI data center infrastructure now?
recklessengagement@lemmy.world 1 month ago
Possible solution to the von Neumann bottleneck? Or does this address a different issue?
HubertManne@moist.catsweat.com 1 month ago
I hope this is true. AI has its uses, but it can't keep being this inefficient. It would be great if answering a query used no more energy than a standard web search.
A_A@lemmy.world 1 month ago
“In this work, a CRAM array based on magnetic tunnel junctions (MTJs) is experimentally demonstrated. First, basic memory operations, as well as 2-, 3-, and 5-input logic operations, are studied. Then, a 1-bit full adder with two different designs is demonstrated.”
www.nature.com/articles/s44335-024-00003-3
So this is experimentally demonstrated, but so far only at small scale.
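For anyone curious what a "1-bit full adder" built from logic-in-memory operations looks like: MTJ-based schemes like this commonly use a 3-input majority operation as a logic primitive (an assumption here; the quote doesn't spell out the gate set), and a full adder falls out of majority plus XOR. A minimal sketch:

```python
def maj(a, b, c):
    """3-input majority gate: outputs 1 when at least two inputs are 1.
    Majority is a common primitive in MTJ/CRAM-style logic-in-memory."""
    return int(a + b + c >= 2)

def full_adder(a, b, cin):
    """1-bit full adder: sum = a XOR b XOR cin, carry-out = MAJ(a, b, cin)."""
    s = a ^ b ^ cin
    cout = maj(a, b, cin)
    return s, cout

# Exhaustive check against ordinary binary addition
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert 2 * cout + s == a + b + cin
```

The point of doing this inside the memory array is that the operands never cross the memory bus, which is where most of the energy goes.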