Right. If I had dumped all my money into AI stonks and was overall deeply invested into silicon valley I’d sell the doomer story, too. Keeps the bubble alive.
Meanwhile the REAL threat of the AI hype, i.e. overburdening ecosystems with ridiculously hungry data centers and putting children’s sanity in the hands of a hallucinating sycophantic autofill can be entirely ignored because “AGI BAD SO WE MUST BUILD AGI”.
JollyG@lemmy.world 18 hours ago
AI doomsday marketing wank has the same vibe as preteens at a sleepover getting spooked by a ouija board.
Thedogdrinkscoffee@lemmy.ca 17 hours ago
Yes, but no different than AI competence wank. LLMs are a significant step forward, but not even remotely intelligent. It’s all bullshit and hype.
One day it won’t be. We aren’t there yet.
echodot@feddit.uk 13 hours ago
I don’t think we’re ever going to get there until we get off this idea that LLMs are in some way going to achieve general intelligence.
IllNess@infosec.pub 17 hours ago
givesomefucks@lemmy.world 15 hours ago
No, Y2K would have been catastrophic.
But a shit ton of people put in a ridiculous amount of work and everything was updated.
The people who were concerned were the fucking “doomers”, and they were the reason shit didn’t go terribly.
But because rational people helped everyone avoid the consequences, idiots think we could have just ignored it and had the same result.
It’s fucking ridiculous that you didn’t learn this lesson from COVID, even if you’re too young to have experienced Y2K.
resipsaloquitur@lemmy.world 14 hours ago
It doesn’t have to be good to be bad.