Comment on LLMs develop their own understanding of reality as their language abilities improve
atrielienz@lemmy.world 2 months ago
That is very deliberately not in the spirit of the question I asked. It’s almost like you’re intent on misunderstanding on purpose just so you can feel like you’re right.
Deceptichum@quokk.au 2 months ago
You asked if it could do a task I wasn’t even capable of doing, and made this your assessment of consciousness.
atrielienz@lemmy.world 2 months ago
No. I asked if it had been given an unclassified, un-named species. Not something someone else just discovered and has already parsed information on. And the point is humans can and do do this, and have done it for centuries, with the right training, as the systems we use for classification have been dialed in.
Deceptichum@quokk.au 2 months ago
The information from 4 days ago had not been parsed; that’s why I chose something so recent.
And an LLM can be trained to do this. When it looked at the Petrel it literally did things humans do, such as taking note of the dark colours common in seabirds, the small size, etc.
atrielienz@lemmy.world 2 months ago
Given nothing at all, could the LLM quantify or develop the tools and systems we use to categorize such species? The spirit of the question is: humans have been able to look at the world around them and do this, using data we gain from our five senses and the scientific method. The LLM cannot develop the same information-gathering, classification, diagnostic, or scientific-method skills in order to do the same. It relies solely on what we provide it and can only operate within those parameters. It does not have senses of its own. That’s the point. Go look up how we have learned to quantify sapience. Because what you’re saying is that since you (a small data point out of trillions or more) can’t do a thing a computer can do, it must be able to think.