I hope you complained all these years when games used “AI” for computer-controlled enemies, because otherwise your post would be super awkward
Comment on The Extreme Cost of Training AI Models.
_sideffect@lemmy.world 1 month ago
All a huge waste of money.
This isn’t AI.
It’s a “Smarter Search”.
SkaveRat@discuss.tchncs.de 1 month ago
_sideffect@lemmy.world 1 month ago
Lmao, you have no idea what you’re saying.
Keep sucking up to these useless AI companies though, they love it!
SkaveRat@discuss.tchncs.de 1 month ago
sure, bud
ContrarianTrail@lemm.ee 1 month ago
It is AI though. AGI, which is a subcategory of AI and what many people seemingly imagine AI to mean, it is not. But AI, yes.
bitjunkie@lemmy.world 1 month ago
Something was needed; tradsearch has sucked dick at anything other than finding a wiki article for an extremely broad topic for over a decade. Just make electricity sustainably. 🤷‍♂️
ZILtoid1991@lemmy.world 1 month ago
Because it got enshittified with SEO, ads, etc.
django@discuss.tchncs.de 1 month ago
I also think that search algorithms work fine, as long as no one is actively trying to fill your results with trash.
pennomi@lemmy.world 1 month ago
AI is a broader term than you might realize. Historically, even mundane algorithms like A* pathfinding were considered AI.
Turns out people like to constantly redefine artificial intelligence to “whatever a computer can’t quite do yet.”
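For context on how mundane “classic AI” actually is: here is a minimal sketch of A* pathfinding on a 2D grid, the kind of algorithm game-AI and AI textbooks have covered for decades. Function and variable names are illustrative, not from any particular library; it uses Manhattan distance as the heuristic.

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a grid of 0 (free) / 1 (wall).

    Cells are (row, col) tuples; movement is 4-directional with unit cost.
    Returns the path as a list of cells from start to goal, or None.
    """
    def h(cell):
        # Manhattan distance: admissible heuristic for 4-directional movement.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        f, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            # Reconstruct the path by walking predecessors back to start.
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if g > best_g.get(cell, float("inf")):
            continue  # stale heap entry, a cheaper route was already found
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cell
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # goal unreachable
```

No learning, no data, just a priority queue and a heuristic — yet this is exactly what “AI” meant in a lot of older games and textbooks.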
_sideffect@lemmy.world 1 month ago
No.
What I’m saying is that what all these companies are presenting us with is a smarter search.
It’s just a tighter grouping of (biased) data that can be searched and retrieved a bit quicker.
If they want to use the term AI, then hell, factory machines from the last century are AI too.
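For what it’s worth, the “tighter grouping of data that can be searched and retrieved a bit quicker” reading roughly matches embedding-based retrieval. A toy sketch of that idea, with invented two-dimensional vectors standing in for real embeddings (a real system would use a learned embedding model):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, corpus, top_k=2):
    """Return the top_k document texts ranked by similarity to the query.

    corpus is a list of {"text": ..., "vec": ...} dicts; the vectors here
    are hypothetical stand-ins for embeddings produced by a trained model.
    """
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, d["vec"]),
                    reverse=True)
    return [d["text"] for d in ranked[:top_k]]
```

Whether you call this “search” or “AI” is exactly the definitional argument in this thread — the ranking step is plain arithmetic, but the vectors come from a trained model.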
smooth_tea@lemmy.world 1 month ago
How is your intelligence different from being “biased data that can be accessed”?
The fact that something can reason about what it presents to you as information is a form of intelligence. And while this discussion is impossible without defining “reason”, I think we should at least agree that when a machine can explain what it did and why, that is a form of reason.
Should we also not define what it means when a person answers a question through reasoning? It’s easy to overestimate the complexity of it because of our personal bias and our ability to fantasize about endless possibilities, but if you break our abilities down, they might be the result of nothing but a large dataset combined with a simple algorithm.
It’s easy to handwave the intelligence of an AI, not because it isn’t intelligent, but because it has no desires, and therefore doesn’t act unless acted upon. It is not easy to square that concept with the idea that something is alive, which is what we generally require before calling it intelligent.