Comment on AI Is Starting to Look Like the Dot Com Bubble
peppersky@feddit.de 1 year ago
What happens if you take a well-done video college course, every subject, and train an AI that's both good at working with people in a teaching frame and also properly versed in the subject matter? You take the course, and in real time you can stop it and ask the AI teacher questions. It helps you, responding exactly to what you ask, and then gives you a quick quiz to make sure you understand. What happens when your class doesn't need to be at a certain time of day or night? What happens if you don't need an hour and a half to sit down and consume the material?
You get stupid-ass students because an AI producing word-salad is not capable of critical thinking.
linearchaos@lemmy.world 1 year ago
It would appear to me that you’ve not been exposed to much in the way of current AI content. We’ve moved past the shitty news articles from 5 years ago.
Eccitaze@yiffit.net 1 year ago
Five years ago? Try last month.
Or hell, why not try literally this instant: a screenshot of a Google query asking whether any countries in Africa start with the letter K, with an inaccurate response saying that Kenya "starts with a K sound, but is spelled with a K sound."
linearchaos@lemmy.world 1 year ago
You make it sound like the tech is completely incapable of uttering a legible sentence.
In one article you have people actively trying to fuck with it to make it screw up, and in your other example you picked the most unstable of the new engines out there.
Omg, it answered a question wrong once! The tech is completely unusable for anything, throw it away, throw it away.
I hate to say it, but the sky is not falling. The tech is still usable, and it's actually the reason why I said we need a specialized model to provide the raw data and grade the responses, using the general model only for conversation and for gathering bullet points for the questions and responses. It's close enough to flawless at that that it'll be fine with some guardrails.
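The split being described could be sketched roughly like this (a minimal, hypothetical illustration: the `Specialist` class, `generalist_reply` stub, and hard-coded lesson data are all invented for this example; a real system would back the specialist with vetted course material and use an actual LLM for the conversational layer):

```python
# Hypothetical sketch: a specialist component owns the vetted course
# material and does the grading, while a general model (stubbed here)
# only handles the conversational phrasing. Grading never depends on
# LLM output, which is the guardrail being described.

LESSON = {
    "question": "What is the capital of Kenya?",
    "answer": "Nairobi",
}

class Specialist:
    """Holds vetted course material and grades student responses."""

    def __init__(self, lesson):
        self.lesson = lesson

    def grade(self, student_answer):
        # Comparison is against curated data, not a generated answer.
        return student_answer.strip().lower() == self.lesson["answer"].lower()

def generalist_reply(correct):
    # Stand-in for the general model, which would only phrase feedback.
    return "Nice work!" if correct else "Not quite, let's review that section."

tutor = Specialist(LESSON)
print(generalist_reply(tutor.grade("Nairobi")))   # correct answer
print(generalist_reply(tutor.grade("Mombasa")))   # incorrect answer
```

The point of the split is that the general model can be as chatty and fallible as it likes, because correctness is decided by the specialist side.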
Eccitaze@yiffit.net 1 year ago
Oh, please. AI does shit like this all the time. Ask it to spell something backwards, and it'll screw up horrifically. Ask it to sort a list of words alphabetically, and it'll give shit out of order. Ask it something outside of its training data, and you'll get a nonsense response, because LLMs are not capable of inference and deductive reasoning.
Having an AI for a teacher is about the stupidest idea I’ve ever heard of, and I’ve heard some really fucking dumb ideas from AI chuds.