LLMs are not the only type of AI out there. ChatGPT appeared seemingly out of nowhere. Who’s to say the next AI system won’t do that as well?
peopleproblems@lemmy.world 11 hours ago
10 to 30? Yeah I think it might be a lot longer than that.
Somehow everyone keeps glossing over the fact that you have to have enormous amounts of highly curated data to feed the trainer in order to develop a model.
Curating data for general purposes is incredibly difficult. The big medical research universities have been working on it for at least a decade, and the tools they have developed, while cool, are only useful as tools to a doctor who has learned how to use them. They can speed up diagnostics and improve patient outcomes, but they cannot replace anything in the medical setting.
The AI we have is like fancy signal processing at best
ContrarianTrail@lemm.ee 9 hours ago
peopleproblems@lemmy.world 3 hours ago
ChatGPT did not appear out of nowhere.
ChatGPT is an LLM: a generative pre-trained model built on a neural network.
In other words: it’s a chat bot that creates its responses based on an insane amount of text data. LLMs trace back to the 90s, and I learned about them in college in the late 2000s and early 2010s. Natural Language Processing was a big contributor, and Google introduced some powerful neural network tech between 2014 and 2017.
The reason they “appeared out of nowhere” to the common man is merely marketing.
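The underlying trick is less mysterious than the hype suggests: predict a plausible next word from statistics gathered over text. Here’s a toy sketch of that idea as a bigram model; this is nothing like a real transformer-based LLM, just an illustration of "responses built from text data" (the corpus and function names are mine):

```python
import random
from collections import defaultdict

# A tiny corpus standing in for the "insane amount of text data".
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Count which word follows which (a bigram table).
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def generate(start, length=5):
    """Generate text by repeatedly sampling a word that followed the last one."""
    out = [start]
    for _ in range(length):
        choices = following.get(out[-1])
        if not choices:
            break
        out.append(random.choice(choices))
    return " ".join(out)

print(generate("the"))
```

Scale that statistical idea up by many orders of magnitude, swap the lookup table for a trained neural network, and you’re in the right conceptual neighborhood.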
ContrarianTrail@lemm.ee 3 hours ago
You’re misquoting me. I haven’t claimed LLMs appeared out of nowhere.
peopleproblems@lemmy.world 2 hours ago
You said ChatGPT appeared out of nowhere. ChatGPT is basically Eliza with an LLM.
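For anyone who hasn’t seen it: Eliza was a 1960s chatbot with no learning at all, just pattern matching and canned reflections. A minimal sketch of that style (these toy rules are my own, not Weizenbaum’s actual script):

```python
import re

# A couple of toy Eliza-style rules: (pattern, response template).
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
]

def eliza_reply(message):
    """Return a canned reflection if a rule matches, else a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(match.group(1))
    return "Tell me more."

print(eliza_reply("I need a vacation"))  # Why do you need a vacation?
```

The point of the comparison: the chat-interface framing people found so novel is old; what’s new under the hood is the LLM generating the replies instead of hand-written rules.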
vritrahan@lemmy.zip 8 hours ago
Anything can happen. We could discover time travel tomorrow. The economy cannot run on wishful thinking.
lennivelkant@discuss.tchncs.de 7 hours ago
It can! For a while. Isn’t that the nature of speculation and speculative bubbles? Sure, they may pop some day, because we don’t know for sure what’s a bubble and what is a promising market disruption. But a bunch of people make a bunch of money until then, and that’s all that matters.
vritrahan@lemmy.zip 5 hours ago
The uncertainty of it is exactly why it shouldn’t be sucking up as much capital and as many resources as it currently is.
knightly@pawb.social 7 hours ago
Tulips all the way down…
RootBeerGuy@discuss.tchncs.de 9 hours ago
Not an expert so I might be wrong, but as far as I understand it, those specialised tools you describe are not even AI. They’re all machine learning. Maybe to the end user it doesn’t matter, but people have this idea of an intelligent machine when it’s more like brute-force information feeding into a model system.
RecluseRamble@lemmy.dbzer0.com 8 hours ago
Don’t say AI when you mean AGI.
By definition AI (artificial intelligence) is any algorithm by which a computer system automatically adapts to and learns from its input. That definition also covers conventional algorithms that aren’t even based on neural nets. Machine learning is a subset of that.
AGI (artificial general intelligence) is the thing you see in movies, the thing people project onto their LLM responses, and what’s driving this bubble. It is the end goal, and means a system able to perform every task a human can, at at least human level. Pretty much all the actual experts agree we’re a long way from such a system.
BallsandBayonets@lemmings.world 7 hours ago
It may be too late on this front, but don’t say AI when there isn’t any I to it.
Of course it could be successfully argued that humans (or at least a large number of them) are also missing the I, and are just spitting out the words expected of them based on the words that have been ingrained in them.
celliern@lemmy.world 6 hours ago
This is not up to you or me: AI is an area of expertise, a scientific field with a precise definition. Broad, but well defined.
frezik@midwest.social 5 hours ago
AI as a field of computer science is mostly about pushing computers to do things they weren’t good at before. Recognizing colored blocks in an image was AI until someone figured out a good way to do it. Playing chess at grandmaster levels was AI until someone figured out how to do it.
Along the way, it created a lot of really important tools. Things like optimizing compilers, virtual memory, and runtime environments. The way computers work today was built on a lot of work out of the old MIT CSAIL lab. Saying “there’s no I to this AI” is an insult to their work.
ContrarianTrail@lemm.ee 5 hours ago
A self-driving car is able to observe its surroundings, identify objects, and change its behaviour accordingly. Thus a self-driving car is intelligent. What’s driving such a car? AI.
You’re free to disagree with how other people define words, but then don’t take part in their discussions expecting everyone to agree with your definition.