Historically, AI has always gotten much better, but usually only after the field collapsed into an AI winter and several years went by in search of a new technique, at which point the hype cycle repeated. Tech bros want it to get better without that winter stage, though.
Comment on We did the math on AI’s energy footprint. Here’s the story you haven’t heard.
TootSweet@lemmy.world 10 months ago
as it gets better
Bold assumption.
WanderingThoughts@europe.pub 10 months ago
ZILtoid1991@lemmy.world 10 months ago
That’s part of why they installed Donald Trump as the dictator of the United States. The other is the network states.
technocrit@lemmy.dbzer0.com 10 months ago
Historically “AI” still doesn’t exist.
WanderingThoughts@europe.pub 10 months ago
Technically even 1950s computer chess is classified as AI.
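That era's "AI" was game-tree search. As an illustrative sketch (not any specific engine's code), here is the minimax idea those early chess programs were built on, applied to a toy game where each move adds 1 or 2 to a counter and the maximizer wants the final value high:

```python
# Minimal minimax sketch: the kind of game-tree search behind
# 1950s-era chess programs. All names here are illustrative.
def minimax(state, depth, maximizing, evaluate, moves, apply_move):
    """Return the best score reachable from `state`, assuming both
    players play optimally for `depth` more plies."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)
    scores = (
        minimax(apply_move(state, m), depth - 1,
                not maximizing, evaluate, moves, apply_move)
        for m in legal
    )
    return max(scores) if maximizing else min(scores)

# Toy game: state is a number, each move adds 1 or 2,
# the evaluation is just the number itself.
best = minimax(
    0, 3, True,
    evaluate=lambda s: s,
    moves=lambda s: [1, 2],
    apply_move=lambda s, m: s + m,
)
print(best)  # max picks +2, min picks +1, max picks +2: 0+2+1+2 = 5
```

No learning, no statistics: just exhaustive search plus a hand-written evaluation function, which is exactly why it counted as AI then and reads as "just an algorithm" now.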
frezik@midwest.social 10 months ago
The issue this time around is infrastructure. The current AI summer depends on massive datacenters with equally massive electrical needs. If companies can’t monetize that enough, they’ll pull the plug and none of this will be available to the general public anymore.
theterrasque@infosec.pub 10 months ago
We’ll still have models like DeepSeek, and (hopefully) discounted used server hardware
Jesus_666@lemmy.world 10 months ago
AI usually got better when people realized it wasn’t going to do all it was hyped up for but was useful for a certain set of tasks.
Then it turned from world-changing hotness to super boring tech your washing machine uses to fine-tune its washing program.
frezik@midwest.social 10 months ago
The major thing that killed 1960s/70s AI was the Vietnam War. MIT’s CSAIL was funded heavily by DARPA. When public opinion turned against Vietnam and Congress started shutting off funding, DARPA wasn’t putting money into CSAIL anymore. Congress didn’t create an alternative funding path, so the whole thing dried up.
That lab basically created computing as we know it today. It bore fruit, and many companies owe their success to it.
IsaamoonKHGDT_6143@lemmy.zip 10 months ago
I wish there was an alternate history forum or novel that explores this scenario.
technocrit@lemmy.dbzer0.com 10 months ago
Pretty sure “AI” didn’t exist in the 60s/70s either.
WanderingThoughts@europe.pub 10 months ago
Like the cliché goes: when it works, we don’t call it AI anymore.
technocrit@lemmy.dbzer0.com 10 months ago
The smart move is never calling it “AI” in the first place.
IsaamoonKHGDT_6143@lemmy.zip 10 months ago
Each winter marks the end of one generation of AI and the beginning of the next. We are now seeing more progress, and as long as there is no technical limit, it seems that progress will not be interrupted.
msage@programming.dev 10 months ago
What progress are we seeing?
Xaphanos@lemmy.world 10 months ago
NVL72 will be enormously impactful on high-end performance.
FreedomAdvocate@lemmy.net.au 10 months ago
In what area of AI? Image generation is improving in leaps and bounds. Video generation even more so. Image reconstruction for games (DLSS, XeSS, FSR) is having generational improvements almost every year. AI chatbots are getting much, much smarter seemingly every month.
What’s one main application of AI that hasn’t improved?
dust_accelerator@discuss.tchncs.de 10 months ago
The spice must flow
Bogasse@lemmy.ml 10 months ago
Yeah, I think there were some efforts, until we found out that adding billions of parameters to a model would let it both write the useless parts of emails that nobody reads and strip out the useless parts of emails that nobody reads.
Melvin_Ferd@lemmy.world 10 months ago
I want my emails to be a series of noises that only computers can hear and communicate with