Comment on A generation taught not to think: AI in the classroom
Modern_medicine_isnt@lemmy.world 4 weeks ago
That sounds like a form of prejudice. I mean, even Siri and Alexa? I don’t use them, for different reasons… but a lot of people use them as voice-activated controls for lights, music, and such. I can’t see how they are different from the Clapper. As for the LLMs… they don’t do any critical thinking, so no one is offloading their critical thinking to them. If anything, using them requires more critical thinking, because everyone who has ever used them knows how often they are flat out wrong.
jpreston2005@lemmy.world 4 weeks ago
Voice-activated light switches that constantly spy on you, harvesting your data for 3rd parties?
Claiming that using AI requires more critical thinking than not using it is a wild take, bro. Gonna have to disagree hard with all of what you said.
Tollana1234567@lemmy.today 3 weeks ago
And recording your conversations, even when you’re not asking Alexa to do anything.
Modern_medicine_isnt@lemmy.world 4 weeks ago
You hit on why I don’t use them. But some people don’t care about that for a variety of reasons. Doesn’t make them less than.
Anyone who tries to use AI without applying critical thinking fails at their task, because AI is just wrong so often. So they either stop using it, or they apply critical thinking to figure out when the results are usable. But we don’t have to agree on that.
jpreston2005@lemmy.world 3 weeks ago
I don’t think using an inaccurate tool gives you extra insight into anything. If I asked you to measure the size of objects around your house, and gave you a tape measure that was not correctly metered, would that make you better at measuring things? We learn by asking questions and getting answers. If the answers given are wrong, then you haven’t learned anything. It, in fact, makes you dumber.
People who rely on AI are dumber, because using the tool makes them dumber. QED?
Modern_medicine_isnt@lemmy.world 3 weeks ago
How about this. I think it is pretty well known that pilots and astronauts are trained on simulations where some of the information they get from “tools” or gauges is wrong. On the surface that is just simulating failures, but the larger purpose is to improve critical thinking. They are trained to take each piece of information in context and, if it doesn’t fit, question it. Sound familiar?
AI spits out lots of information with every response. Much of it will be accurate, but sometimes there is a faulty basis that causes one or more parts of the information to be wrong. The wrongness almost always follows a pattern, and in context the information is usually obviously wrong. If you learn to spot the faulty basis, you can even suss out which information is still good. Or you can just tell it where it went wrong, and it will often come back with the correct answer.
Talking to people isn’t all that different. There is a whole sub on Reddit for people being confidently wrong. But spotting when a person is wrong is often harder, because the depth of their faulty basis can be so much deeper than an AI’s. And, they are people, so you often can’t politely question the accuracy of what they are saying. Or they are just a podcast… I think you get where I am going.
BananaIsABerry@lemmy.zip 3 weeks ago
If AI has significant limitations and is often wrong (which it definitely is), wouldn’t it take more critical thinking to determine when it’s done something wrong?
jpreston2005@lemmy.world 3 weeks ago
If I were to give you a calculator that was programmed to give the wrong answers, would that be a useful tool? Would you be better off for having used it?
bold_atlas@lemmy.world 3 weeks ago
AI is literally the “Calcucorn” from Tim Heidecker’s “Tom Goes to the Mayor.”
BananaIsABerry@lemmy.zip 3 weeks ago
Does a calculator do a significant amount of statistical analysis and base its output on the most probable result from a massive data set?
No. That would be stupid.
People taking the response from LLMs at face value is a problem, which is the point of the discussion, but disregarding it entirely would be equally dumb. Critical thinking would include knowing when and where to use a specific tool instead of trying to force one to be universal.
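To illustrate the contrast the commenter is drawing (a minimal sketch with made-up numbers, not taken from any real model): a calculator applies a fixed rule and always returns the same answer, while an LLM picks its next token from a learned probability distribution, so the most probable output is not guaranteed to be the correct one.

```python
# Toy illustration only: the token probabilities below are invented,
# not drawn from any real language model.
import random

# A calculator applies a deterministic rule.
def calculator_add(a, b):
    return a + b  # same inputs always give the same (correct) answer

# A language model scores possible continuations and samples among the
# most probable ones, which may or may not be factually correct.
def llm_next_token(context):
    # Hypothetical learned distribution for the prompt "2 + 2 ="
    token_probs = {"4": 0.90, "5": 0.06, "22": 0.04}
    tokens, weights = zip(*token_probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

print(calculator_add(2, 2))        # always 4
print(llm_next_token("2 + 2 ="))   # usually "4", occasionally not
```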