Comment on 28-pound electric motor delivers 1000 horsepower
givesomefucks@lemmy.world 4 months ago
Stop burning the planet down to generate social media comments
Nima@leminal.space 4 months ago
I didn’t realize it even was AI generated. But even if it is, that’s still a fairly off-putting way to respond.
givesomefucks@lemmy.world 4 months ago
but even if it is, that’s still a fairly off-putting way to respond.
No, you’re right…
It’s not like it’s literally burning our planet down, and it’s not like the people profiting off it are tech bro fascists…
Nima@leminal.space 4 months ago
attacking someone will never change their mind.

givesomefucks@lemmy.world 4 months ago
I mean, I thought it would be obvious my issue was with using AI to do so…
Even if it had been a serious question.
But, to be fair, I was thinking of what a normal person would be able to parse, not people whose critical thinking has already atrophied from offloading it to AI.
They probably don’t have any idea what I meant and would need it explicitly spelled out.
thefactremains@lemmy.world 4 months ago
If it makes you feel better (or at least more educated)… the entire three-prompt interaction to calculate dogpower consumed roughly the same amount of energy as making three Google searches.
A single Google search uses about 0.3 watt-hours (Wh) of energy. A typical AI chat query with a modern model uses a similar amount, roughly 0.2 to 0.34 Wh. Therefore, my dogpower curiosity discussion used approximately 0.9 Wh in total.
For context, this is less energy than an LED lightbulb consumes in a few minutes. While older AI models were significantly more energy-intensive (sometimes using 10 times more power than a search), the latest versions have become nearly as efficient for common tasks.
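(A quick sanity check of that arithmetic in Python, using only the figures quoted in the comment; the 10 W LED wattage is an assumed value, not from the thread:)

```python
# Sanity check of the figures quoted above. The per-query energies are the
# commenter's claims, not measurements; the 10 W LED bulb is an assumption.
search_wh = 0.3      # claimed energy per Google search (Wh)
ai_query_wh = 0.3    # claimed energy per AI chat query (Wh), within the 0.2-0.34 range
prompts = 3

total_wh = prompts * ai_query_wh
print(f"3-prompt chat: ~{total_wh:.1f} Wh")  # ~0.9 Wh

led_watts = 10       # assumed typical LED bulb
led_minutes = total_wh / led_watts * 60
print(f"Same energy runs the LED for ~{led_minutes:.1f} min")  # ~5.4 min
```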
verdi@feddit.org 4 months ago
This is not correct and can easily be disproven, even if one assumes less than 480 g/kWh.
And that is ignoring the infrastructure necessary to perform a search vs. an AI query.
thefactremains@lemmy.world 4 months ago
You’re absolutely right! I was using older, broader estimates. According to the research you cited (“Energy costs of communicating with AI”), the energy use is much lower than I estimated.
The paper shows that an efficient AI model (Qwen 7B) used only 0.058 watt-hours (Wh) per query. Based on that data, my entire 3-prompt chat only used about 0.17 Wh. That’s actually less energy than a single Google search (~0.3 Wh). Thanks for sharing the source and correcting me.
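(Checking that corrected arithmetic, again in Python, with the Qwen 7B figure quoted above:)

```python
# Redo the arithmetic with the paper's Qwen 7B figure (0.058 Wh/query),
# as quoted in the comment above.
qwen_wh = 0.058
search_wh = 0.3
prompts = 3

total_wh = prompts * qwen_wh
print(f"3-prompt chat: ~{total_wh:.2f} Wh")                        # ~0.17 Wh
print(f"That is {total_wh / search_wh:.0%} of one Google search")  # 58%
```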
verdi@feddit.org 4 months ago
If one assumes 1/3 correctness is sufficient and the provider is using a 7B model, it is safe to assume the exchange was energy efficient and better than a traditional search. However, on the other end of the spectrum, if one assumes the most efficient reasoning model, which consumes ~400x more energy and still only manages 4/5 accurate responses, the entire discussion is flipped on its head (rough numbers sketched below).
It is, however, comical to see one jump to an irreproducible edge case to prove one’s point; it really exemplifies how weak the position was from the beginning. Intellectual dishonesty galore.
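(A sketch of the flip verdi describes, using only numbers from this thread. Treating “~400x” as a multiplier on the 0.058 Wh baseline is an interpretation, as is comparing energy per correct answer rather than per query:)

```python
# Energy per *correct* answer under the thread's numbers: 0.058 Wh/query at
# 1/3 accuracy for the 7B model, ~400x that energy at 4/5 accuracy for the
# "most efficient reasoning model".
search_wh = 0.3
models = {
    "7B model":        (0.058,       1 / 3),  # (Wh per query, accuracy)
    "reasoning model": (0.058 * 400, 4 / 5),  # ~400x the baseline energy
}

for name, (wh, acc) in models.items():
    per_correct = wh / acc
    print(f"{name}: {wh:.3f} Wh/query -> {per_correct:.2f} Wh per correct "
          f"answer ({per_correct / search_wh:.1f}x one search)")
```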