Comment on Judge Rules Training AI on Authors' Books Is Legal But Pirating Them Is Not
Enkimaru@lemmy.world 3 weeks ago
You are obviously not educated on this.
It did not “learn” any more than a downloaded video run through a compression algorithm. Just: LoLz.
gaja@lemm.ee 3 weeks ago
I’ve hand-calculated forward propagation (neural networks). AI does not learn; it’s statistically optimized. AI “learning” is curve fitting. Human learning requires understanding, which AI is not capable of.
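To make “forward propagation” and “learning as curve fitting” concrete, here is a minimal Python sketch. The tiny 2-2-1 network, its weights, and the data points are all made-up placeholders, not anyone’s actual calculation.

```python
# Minimal sketch (all numbers made up): forward propagation through a tiny
# 2-2-1 network, then "learning" as plain least-squares curve fitting.
import math

# --- forward propagation: inputs -> hidden layer -> output ---
x = [0.5, -1.0]                       # input vector
W1 = [[0.1, 0.4], [-0.3, 0.2]]        # hidden-layer weights (2x2)
b1 = [0.0, 0.1]                       # hidden-layer biases
W2 = [0.7, -0.5]                      # output-layer weights
b2 = 0.2                              # output-layer bias

hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
          for row, b in zip(W1, b1)]
output = sum(w * h for w, h in zip(W2, hidden)) + b2
print("forward pass output:", output)

# --- "learning" as curve fitting: gradient descent on y ~ a*x + b ---
data = [(0.0, 1.1), (1.0, 2.9), (2.0, 5.2), (3.0, 6.8)]  # toy (x, y) pairs
a, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    grad_a = sum(2 * (a * xi + b - yi) * xi for xi, yi in data) / len(data)
    grad_b = sum(2 * (a * xi + b - yi) for xi, yi in data) / len(data)
    a, b = a - lr * grad_a, b - lr * grad_b
print(f"fitted line: y ≈ {a:.2f}x + {b:.2f}")   # ends up near y = 2x + 1
```

The second half is ordinary gradient descent on a squared-error loss, the same basic optimization step used in training loops, just at a vastly smaller scale.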
nednobbins@lemmy.zip 3 weeks ago
Human learning requires understanding, which AI is not capable of.
How could anyone know this?
Is there some test of understanding that humans can pass and AIs can’t? And if there are humans who can’t pass it, do we consider them unintelligent?
We don’t even need to set the bar that high. Is there some definition of “understanding” that humans meet and AIs don’t?
gaja@lemm.ee 3 weeks ago
It’s literally in the phrase “statistically optimized.” This is like arguing for your preferred deity: it’ll never be proven, but we have enough evidence to draw our own conclusions. As it stands, AI doesn’t learn or understand the same way humans do.
nednobbins@lemmy.zip 3 weeks ago
So you’re confident that human learning involves “understanding,” which is distinct from “statistical optimization.” Is this something you feel in your soul, or can you define the difference?
hoppolito@mander.xyz 3 weeks ago
I am not sure what your contention, or gotcha, is with the comment above, but they are quite correct. They also chose quite an apt example with video compression, since in most ways current ‘AI’ effectively functions as a compression algorithm, just for our language corpora instead of video.
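One hedged way to read that claim: a model that predicts the next symbol with probability p can, via an arithmetic-style coder, store it in about −log2 p bits, so better prediction means better compression. The sketch below uses a toy character bigram model as a stand-in for a language model and only computes the bit cost an ideal coder would achieve; the corpus and test string are placeholders, and no real codec is implemented.

```python
# Sketch of the "language model as compressor" idea with a toy character
# bigram model standing in for an LLM (illustrative only, not a real codec).
import math
from collections import Counter, defaultdict

corpus = "the cat sat on the mat. the dog sat on the log. " * 50

# "Train" the model: count how often each character follows each other character.
pair_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    pair_counts[prev][nxt] += 1

def prob(prev, nxt):
    """P(next char | previous char) with add-one smoothing over seen symbols."""
    counts = pair_counts[prev]
    vocab = set(corpus)
    return (counts[nxt] + 1) / (sum(counts.values()) + len(vocab))

# An arithmetic coder could store each character in about -log2 p bits,
# so the model's predictions directly set the achievable compressed size.
text = "the cat sat on the log."
bits = sum(-math.log2(prob(prev, nxt)) for prev, nxt in zip(text, text[1:]))
print(f"model cost: {bits:.1f} bits vs. raw 8-bit encoding: {len(text) * 8} bits")
```

The better the model predicts the text, the fewer bits it needs, which is the sense in which a predictive model doubles as a compressor.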
nednobbins@lemmy.zip 3 weeks ago
They seem pretty different to me.
Video codec developers go to a lot of effort to make compression deterministic. We don’t necessarily care that a particular video stream compresses to a particular bit sequence, but we very much care that decompression gets you as close to the original as possible.
AIs will rarely produce exact replicas of anything. They synthesize outputs from heterogeneous training data. That sounds like learning to me.
The one area where there’s some similarity is dimensionality reduction. It’s technically a form of compression, since it makes your files smaller. It would also be an extremely expensive way to get extremely bad compression: it would take orders of magnitude more hardware resources, and the images would likely be unrecognizable.
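As a rough illustration of that last point, here is a dimensionality-reduction “codec” built from PCA (via SVD) in numpy; the random 64x64 “image” and the choice of 8 components are arbitrary placeholders.

```python
# Rough sketch of "dimensionality reduction as (bad) compression": project a
# 64x64 grayscale "image" onto a few principal components and reconstruct it.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))                 # stand-in for a real image

# PCA via SVD: keep only the top-k components.
k = 8
U, S, Vt = np.linalg.svd(image, full_matrices=False)
compressed_size = U[:, :k].size + k + Vt[:k].size     # numbers we must store
reconstruction = U[:, :k] @ np.diag(S[:k]) @ Vt[:k]

error = np.mean((image - reconstruction) ** 2)
print(f"stored values: {compressed_size} vs. original {image.size}")
print(f"mean squared reconstruction error: {error:.4f}")
```

Keeping 8 components stores roughly a quarter of the original values, and on data with no low-rank structure the reconstruction error is large, which is the “extremely bad compression” point.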
gaja@lemm.ee 3 weeks ago
Google search results aren’t deterministic either, but I wouldn’t say Google “learns” like a person does. Algorithms with pattern detection aren’t the same as human learning.
nednobbins@lemmy.zip 3 weeks ago
You may be correct, but we don’t really know how humans learn.
There’s a ton of research on it and a lot of theories but no clear answers.
There’s general agreement that the brain is a bunch of neurons; there are no convincing ideas on how consciousness arises from that mass of neurons.
The brain also has a bunch of chemicals that affect neural processing; there are no convincing ideas on how that gets you consciousness either.
We modeled perceptrons after neurons, and we’ve been working to make them more like neurons ever since. Neurons don’t have any obvious capabilities that perceptrons don’t have.
That’s the big problem with any claim that “AI doesn’t do X like a person”: since we don’t know how people do it, we can neither verify nor refute that claim.
There’s more to AI than just being non-deterministic, but anything that’s too deterministic definitely isn’t an intelligence, natural or artificial. Video compression algorithms are very far removed from AI.
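For reference, the perceptron mentioned a few comments up is small enough to show in full: the classic perceptron learning rule on a toy, linearly separable problem (logical AND). The weights, learning rate, and epoch count below are arbitrary.

```python
# Minimal sketch of a perceptron: the classic learning rule applied to the
# logical AND function (a linearly separable toy problem).
def step(z):
    return 1 if z >= 0 else 0

# (x1, x2) -> target
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
bias = 0.0
lr = 0.1

for epoch in range(20):
    for (x1, x2), target in data:
        prediction = step(w[0] * x1 + w[1] * x2 + bias)
        error = target - prediction
        # Perceptron update rule: nudge weights only when the prediction is wrong.
        w[0] += lr * error * x1
        w[1] += lr * error * x2
        bias += lr * error

print("weights:", w, "bias:", bias)
print([step(w[0] * x1 + w[1] * x2 + bias) for (x1, x2), _ in data])  # -> [0, 0, 0, 1]
```

This converges for linearly separable data; modern networks scale the same building block up with differentiable activations and gradient descent instead of the hard step rule.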