I remember when “heuristics” were all the rage. Frankly, that’s what LLMs are: advanced heuristics. “Intelligence” is nothing more than marketing bingo.
Sonotsugipaa@lemmy.dbzer0.com 4 days ago
In defense of people who say LLMs are not intelligent: they probably mean to say they are not sapient, and I think they’re loosely correct if you take the everyday word “intelligent” to mean something different from the term-of-art “Intelligence” in “Artificial Intelligence”.
ramble81@lemmy.zip 4 days ago
ArgumentativeMonotheist@lemmy.world 4 days ago
‘Intelligence’ requires understanding, and understanding requires awareness. This is not seen in anything called “AI”, not today at least, and maybe not ever. Again, why not use a different word, one that actually applies to these advanced calculators? Expecting the best out of humanity, it may be because of the added pizzazz or simple semantic confusion… but seeing the people behind it all, it’s probably so the dummies get overly excited and buy stuff/make these bots integral parts of their lives. 🤷
Sonotsugipaa@lemmy.dbzer0.com 4 days ago
The term “Artificial Intelligence” has been around for a long time; 25 years ago, AI was an acceptable name for NPC logic in video games. Arguably that’s still the case, and personally I vastly prefer “Artificial Intelligence” to “Broad Simulation Of Common Sense Powered By Von Neumann Machines”.
XeroxCool@lemmy.world 4 days ago
The overuse (and overtrust) of LLMs has made me feel ashamed to call video game NPCs “AI”, and I hate that. There was nothing wrong with it: we all understood the AI’s abilities were limited to specific functions. I loved when Forza Horizon introduced “Drivatar” AI personalities of actual players, modeled on how those players actually drive. Now it’s a vomit term for shady search engines and confused visualizers.
Sonotsugipaa@lemmy.dbzer0.com 4 days ago
I don’t share the feeling. I’ll gladly tie an M$ shareholder to a chair, force them to watch me play Perfect Dark, and say “man I love these AI settings, I wish they made AI like they used to”.
Perspectivist@feddit.uk 4 days ago
“Understanding requires awareness” isn’t some settled fact - it’s just something you’ve asserted. There’s plenty of debate around what understanding even is, especially in AI, and awareness or consciousness is not a prerequisite in most definitions. Systems can model, translate, infer, and apply concepts without being “aware” of anything - just like humans often do things without conscious thought.
You don’t need to be self-aware to understand that a sentence is grammatically incorrect or that one molecule binds better than another. It’s fine to critique the hype around AI - a lot of it is overblown - but slipping in homemade definitions like that just muddies the waters.
ArgumentativeMonotheist@lemmy.world 4 days ago
Do you think “AI” KNOWS/UNDERSTANDS what a grammatically incorrect sentence is or what molecules even are? How?
Perspectivist@feddit.uk 4 days ago
You’re moving the goalposts. First you claimed understanding requires awareness, now you’re asking whether an AI knows what a molecule is - as if that’s even the standard for functional intelligence.
No, AI doesn’t “know” things the way a human does. But it can still reliably identify ungrammatical sentences or predict molecular interactions based on training data. If your definition of “understanding” requires some kind of inner experience or conscious grasp of meaning, then fine. But that’s a philosophical stance, not a technical one.
The point is: you don’t need subjective awareness to model relationships in data and produce useful results. That’s what modern AI does, and that’s enough to call it intelligent in the functional sense - whether or not it “knows” anything in the way you’d like it to.
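A minimal sketch of what that “functional” sense can look like in practice (a toy illustration added here, not anything referenced in the thread): the bigram model below flags a scrambled word order purely from statistics it picked up in a handful of example sentences, with no awareness anywhere in the loop.

```python
# Toy sketch: a bigram model "judges" grammaticality by scoring word-order
# statistics learned from a tiny corpus. Nothing here is aware of anything;
# it only models relationships in data and still produces a useful signal.
from collections import defaultdict

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat slept on the sofa",
]

# Count how often each word follows each other word.
counts = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1

def score(sentence: str) -> float:
    """Average probability of each word given the previous one (0 if unseen)."""
    words = sentence.split()
    probs = []
    for prev, nxt in zip(words, words[1:]):
        total = sum(counts[prev].values())
        probs.append(counts[prev][nxt] / total if total else 0.0)
    return sum(probs) / len(probs) if probs else 0.0

print(score("the cat sat on the rug"))   # relatively high: familiar word order
print(score("rug the on sat cat the"))   # near zero: violates learned patterns
```

Whether that behavior deserves the word “understanding” is exactly the philosophical question being argued above; the sketch only shows that the useful behavior itself doesn’t require any inner experience.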
BB84@mander.xyz 4 days ago
Do most humans understand what molecules are? How?