People who cannot do matrix multiplication do not possess the basic concepts of intelligence now? Or is software that can do matrix multiplication intelligent?
Comment on I'm looking for an article showing that LLMs don't know how they work internally
glizzyguzzler@lemmy.blahaj.zone 4 days ago
You can prove it’s not by doing some matrix multiplication and seeing it’s matrix multiplication. Much easier way to go about it
whaleross@lemmy.world 3 days ago
People who cannot do matrix multiplication do not possess the basic concepts of intelligence now? Or is software that can do matrix multiplication intelligent?
glizzyguzzler@lemmy.blahaj.zone 1 day ago
So close, LLMs work via matrix multiplication, which is well understood by many meat bags, and matrix math can’t think. If a meat bag can’t do matrix math, that’s ok, because the meat bag doesn’t work via matrix multiplication. lol imagine forgetting how to do matrix multiplication and disappearing into a singularity or something
whaleross@lemmy.world 1 day ago
Well, on the other hand. Meat bags can’t really do neuron stuff either, even though that is essential for any meat bag operation. Humans are still here though, and so are dogs.
futatorius@lemm.ee 2 days ago
People who cannot do matrix multiplication do not possess the basic concepts of intelligence now?
As a mathematician (at least by education), I think that’s a great definition, yes.
theunknownmuncher@lemmy.world 4 days ago
Yes, neural networks can be implemented with matrix operations. What does that have to do with proving or disproving the ability to reason? You didn’t post a relevant or complete thought
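For anyone following along who hasn’t seen it spelled out: the claim that neural networks “can be implemented with matrix operations” boils down to each layer being a matrix-vector product plus a nonlinearity. A minimal sketch in plain Python (all the numbers here are made up purely for illustration, not from any real model):

```python
# One dense neural-network layer computed by hand: y = relu(W @ x + b).
# Weights, bias, and input are invented for illustration.
W = [[1.0, -2.0],
     [0.5,  0.5]]   # 2x2 weight matrix
b = [0.1, -0.1]     # bias vector
x = [3.0, 1.0]      # input vector

def matvec(M, v):
    """Plain matrix-vector multiplication: each row of M dotted with v."""
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

z = [zi + bi for zi, bi in zip(matvec(W, x), b)]  # the matrix math step
y = [max(0.0, zi) for zi in z]                    # ReLU activation

print(y)  # [1.1, 1.9]
```

Whether stacking millions of these steps amounts to “reasoning” is exactly the question being argued in this thread; the sketch only shows what the underlying operation is.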
glizzyguzzler@lemmy.blahaj.zone 3 days ago
Improper comparison; an audio file isn’t the basic action, it is the data; the audio codec is the basic action on the data
“An LLM model isn’t really an LLM because it’s just a series of numbers”
But the actions that turn the series of numbers into something of value (an audio codec for an audio file, matrix math for an LLM) can be analyzed
And clearly matrix multiplication cannot reason any better than an audio codec algorithm. It’s matrix math, it’s cool, we love matrix math. Really big matrix math is really cool and makes real-sounding stuff. But it’s just matrix math, that’s how we know it can’t think
theunknownmuncher@lemmy.world 3 days ago
LOL you didn’t really make the point you thought you did. It isn’t an “improper comparison” (it’s called a false equivalency FYI), because there isn’t a real distinction between information and this thing you just made up called “basic action on data”, but anyway have it your way:
Your comment is still exactly like saying an audio pipeline isn’t really playing music because it’s actually just doing basic math.
glizzyguzzler@lemmy.blahaj.zone 1 day ago
I was channeling the Interstellar docking computer (“improper contact” in such a sassy voice) ;)
There is a distinction between data and an action you perform on data (matrix maths, codec algorithm, etc.). It’s literally completely different.
An audio codec (not a pipeline) is just actually doing math - just like the workings of an LLM. There’s plenty of work to be done after the audio codec decodes the m4a to get to tunes in your ears. Same for an LLM, sandwiching those matrix multiplications that make the magic happen are layers that crunch the prompts and assemble the tokens you see it spit out.
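That “sandwich” (crunch the prompt, do the matrix math, assemble the output token) can be sketched as a toy in a few lines. Everything here, the three-word vocabulary and the single weight matrix standing in for the whole model, is invented for illustration; a real LLM does this with billions of parameters and many stacked layers:

```python
# Toy "LLM" pipeline: tokenize -> matrix multiplication -> pick a token.
# Vocabulary and weights are made up for illustration only.
vocab = ["hello", "world", "cat"]
token_ids = {w: i for i, w in enumerate(vocab)}

def one_hot(i, n):
    """Encode token id i as a vector the matrix math can act on."""
    return [1.0 if j == i else 0.0 for j in range(n)]

# A made-up 3x3 weight matrix standing in for the whole model.
W = [[0.1, 0.9, 0.0],
     [0.8, 0.1, 0.1],
     [0.2, 0.2, 0.6]]

def next_token(word):
    x = one_hot(token_ids[word], len(vocab))        # crunch the prompt
    logits = [sum(w * xi for w, xi in zip(row, x))  # the matrix multiplication
              for row in W]
    return vocab[logits.index(max(logits))]         # assemble the output token

print(next_token("hello"))  # world
```

The pre- and post-processing layers around the matrix math are exactly the tokenize/decode steps here, just vastly more elaborate in practice.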
LLMs can’t think, that’s just the fact of how they work. The problem is that AI companies are happy to describe them in terms that make you think they can think, to sell their product! I literally cannot be wrong that LLMs cannot think or reason, there’s no room for debate, it’s settled long ago. AI companies will string the LLMs together and let them chew for a while to try to make them catch when they’re dropping bullshit. It’s still not thinking and reasoning though. They can be useful tools, but LLMs are just tools, not sentient or verging on sentient
DarkDarkHouse@lemmy.sdf.org 3 days ago
Do LLMs not exhibit emergent behaviour? But who am I, a simple skin-bag of chemicals, to really say.
glizzyguzzler@lemmy.blahaj.zone 1 day ago
They do not, and I, a simple skin-bag of chemicals (mostly water tho) do say
BB84@mander.xyz 3 days ago
Can humans think?