It's not a big deal if you aren't completely stupid. I don't use LLMs to learn topics I know nothing about, but I do use them to assist me in figuring out solutions to things I'm somewhat familiar with. In my case I find it easy to catch incorrect info, and even if I don't catch it, most of the time if you just occasionally tell it to double-check what it said, it self-corrects.
Comment on AI is rotting your brain and making you stupid
sem@lemmy.blahaj.zone 3 days ago
The problem is if it's wrong, you have no way to know without double-checking everything it says
Zetta@mander.xyz 2 days ago
Nalivai@lemmy.world 2 days ago
It is a big deal. There is a whole set of ways humans can gauge the validity of info, and they are perpendicular to the way we interact with fancy autocomplete.
Every single word might be false, with no pattern to it. So if you can and do check it, you're just wasting your time and humanity's resources instead of finding the info yourself in the first place. If you don't, or if you only think you do, it's even worse: you are being fed lies and believe them extra hard.
SpicyColdFartChamber@lemm.ee 3 days ago
I understand that. I am careful not to use it as my main teaching source, rather as a supplement. It helps when I want to dive into the root cause of something, which I then double-check against real sources.
Nalivai@lemmy.world 2 days ago
But like, why not go to the real sources directly in the first place? Why add an unnecessary layer that doesn't really add anything?
SpicyColdFartChamber@lemm.ee 2 days ago
I do go to the real source first. But sometimes, I just need a very simple explanation before I can dive deep into the topic.
My brain sucks; I give up very easily if I don't understand something. (This has been true since way before short-form content and the internet.)
Grimtuck@lemmy.world 3 days ago
To be fair, this can also be said of teachers. It's important to recognise that AIs are only as accurate as any single source, and you should always check everything yourself. I have concerns about a future where our only available sources are through AI.
Nalivai@lemmy.world 2 days ago
The level of psychopathy required for a human to lie as blatantly as an LLM is almost unachievable
Jakeroxs@sh.itjust.works 2 days ago
Bruh, so much of our lives is made up of people lying, either intentionally or unintentionally, via spreading misinformation.
I remember being in 5th grade and my science teacher in a public school was teaching the “theory” of evolution but then she mentioned there are “other theories like intelligent design”
She wasn’t doing it to be malicious, just a brainwashed idiot.
Nalivai@lemmy.world 2 days ago
And that's why we, as humans, know how to look for signs of this in other humans. This is a skill we had to learn precisely because of that. Not only is it not applicable when you read generated bullshit, it actually does the opposite.
Some people are mistaken, some people are actively misleading, almost no one has the combination of being wrong just enough, and confident just enough, to sneak their bullshit under the bullshit detector.