when it can come up with a solution it hasn’t seen before.
that’s the threshold.
that’s the threshold for creative problem solving, which isn’t all there is to intelligence, but i think it’s fair to say it’s the most crucial part for a machine intelligence.
hoshikarakitaridia@lemmy.world 1 day ago
This.
I taught high school teens about AI between 2018 and 2020.
The issue is we are somewhere between getting better at gambling (statistics, Markov chains, etc.) and human brain simulation (deep neural networks, genetic algorithms).
For many people it’s important how we frame it. Is it a random word generator with a good hit rate, or is it a very stupid child?
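To make the “random word generator with a good hit rate” framing concrete, here’s a minimal Python sketch of the Markov-chain end of that spectrum (my own toy example; the corpus and names are made up, not from the comment above): it just counts which word tends to follow which in some text, then gambles on the next word from those counts.

```python
import random
from collections import defaultdict

def train(text):
    """Count which word follows which in the training text."""
    counts = defaultdict(list)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        counts[current].append(nxt)
    return counts

def generate(counts, start, length=10):
    """Walk the chain: at each step, sample the next word from observed followers."""
    word = start
    output = [word]
    for _ in range(length):
        followers = counts.get(word)
        if not followers:
            break
        word = random.choice(followers)  # the "gambling" step: weighted by observed frequency
        output.append(word)
    return " ".join(output)

# Hypothetical toy corpus, just to show the mechanics.
corpus = "the brain is more advanced than the model but the model is fast"
chain = train(corpus)
print(generate(chain, "the"))
```

Everything it “knows” is a table of word-following frequencies; a deep neural network replaces that table with learned weights, which is why the framing question matters.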
Of course the brain is more advanced - it has way more neurons than an AI model has nodes, it works faster, and we have years of “training data”. Also, we can use specific parts of our brains to think, and some things are so innate we don’t even have to think about them: we call those reflexes, and they bypass the normal thinking process.
BUT: we’re at the stage where we could technically emulate chunks of a human brain through AI models, however primitive they currently are. And in its basic function, a brain is not really much more advanced than what our AI models already do. Although we do have a specific part of our brain just for language, which means we get a little cheat code for writing text in comparison to AI, and similar other parts for creative tasks and so on.
So where do you draw the line? Do you need all the different parts of a brain perfectly emulated to satisfy the definition of intelligence? Is “artificial intelligence” a label awarded to less intelligent models and constructs, or does it have to be just as intelligent as human intelligence?
Imo AI sufficiently passes the vibe check on intelligence. Sure, it’s not nearly on the scale of a human brain and is missing its biological arrangements and some clever evolutionary tricks, but it’s similar enough.
However, I think that’s neither scary nor awesome. It’s just a different potential tool that should help every one of us. Every time big new discoveries shape our understanding of the world and become a core part of our lives, there’s so much drama. But it’s just a bigger change, nothing more, nothing less. A pile of new laws, some cultural shifts and some upgrades for our everyday life. It’s neither heaven nor hell, just the same chunk of rock floating in space soup for another century.
HeyThisIsntTheYMCA@lemmy.world 1 day ago
I dunno, the power requirements would seem to be an ecological catastrophe in the making, except it’s already happening.
Shanmugha@lemmy.world 1 day ago
Well, if we are not looking at all the disaster the hype is doing on so many levels (which is fine in the sense that technology and fools are different things), I draw the line at… intelligence, not simulation of hardware. I care a lot less whether something before me runs on carbon, metal or, say, sulfur than whether it is intelligent.
And as someone has already pointed out, even defining intelligence is damn hard, and different intelligence works differently (someone who is great at moving their body, like dancers or martial artists, is definitely more intelligent than me in quite a few areas, even if I know math or computers better than them). So… “artificial intelligence” as a bunch of algorithms (including LLMs) etc - no problem with me; “artificial intelligence” as “this thing is thinking” or “this thing is just as good an artist/doctor/lawyer as a human” - nah, bullshit.