Yeah, I think there were some efforts, until we found out that adding billions of parameters to a model would let it both write the useless parts of emails that nobody reads and strip those useless parts back out.
Comment on We did the math on AI’s energy footprint. Here’s the story you haven’t heard.
TootSweet@lemmy.world 2 weeks ago
as it gets better
Bold assumption.
Bogasse@lemmy.ml 2 weeks ago
Melvin_Ferd@lemmy.world 2 weeks ago
I want my emails to be a series of noises that only computers can hear and communicate with
WanderingThoughts@europe.pub 2 weeks ago
Historically AI always got much better. Usually after the field collapsed in an AI winter and several years went by in search of a new technique, only to repeat the hype cycle. Tech bros want it to get better without that winter stage though.
Jesus_666@lemmy.world 2 weeks ago
AI usually got better when people realized it wasn’t going to do all it was hyped up for but was useful for a certain set of tasks.
Then it turned from world-changing hotness to super boring tech your washing machine uses to fine-tune its washing program.
WanderingThoughts@europe.pub 2 weeks ago
Like the cliché goes: when it works, we don’t call it AI anymore.
technocrit@lemmy.dbzer0.com 2 weeks ago
The smart move is never calling it “AI” in the first place.
frezik@midwest.social 2 weeks ago
The major thing that killed 1960s/70s AI was the Vietnam War. MIT’s CSAIL was funded heavily by DARPA. When public opinion turned against Vietnam and Congress started shutting off funding, DARPA wasn’t putting money into CSAIL anymore. Congress didn’t create an alternative funding path, so the whole thing dried up.
That lab basically created computing as we know it today. It bore fruit, and many companies owe their success to it.
IsaamoonKHGDT_6143@lemmy.zip 2 weeks ago
I wish there was an alternate history forum or novel that explores this scenario.
technocrit@lemmy.dbzer0.com 2 weeks ago
Pretty sure “AI” didn’t exist in the 60s/70s either.
IsaamoonKHGDT_6143@lemmy.zip 2 weeks ago
Each winter marks the end of one generation of AI and the beginning of the next. We are now seeing more progress, and as long as there is no technical limit, it seems that progress will not be interrupted.
msage@programming.dev 2 weeks ago
What progress are we seeing?
FreedomAdvocate@lemmy.net.au 2 weeks ago
In what area of AI? Image generation is improving in leaps and bounds. Video generation even more so. Image reconstruction for games (DLSS, XeSS, FSR) is having generational improvements almost every year. AI chatbots are getting much much smarter seemingly every month.
What’s one main application of AI that hasn’t improved?
Xaphanos@lemmy.world 2 weeks ago
NVL72 will be enormously impactful on high-end performance.
dust_accelerator@discuss.tchncs.de 2 weeks ago
The spice must flow
technocrit@lemmy.dbzer0.com 2 weeks ago
Historically “AI” still doesn’t exist.
WanderingThoughts@europe.pub 2 weeks ago
Technically even 1950s computer chess is classified as AI.
frezik@midwest.social 2 weeks ago
The issue this time around is infrastructure. The current AI summer depends on massive datacenters with equally massive electrical needs. If companies can’t monetize that enough, they’ll pull the plug and none of this will be available to the general public anymore.
theterrasque@infosec.pub 2 weeks ago
We’ll still have models like deepseek, and (hopefully) discount used server hardware
ZILtoid1991@lemmy.world 1 week ago
That’s part of why they installed Donald Trump as the dictator of the United States. The other is the network states.