Define, “reasoning”. For decades software developers have been writing code with conditionals. That’s “reasoning.”
LLMs are “reasoning”… They’re just not doing human-like reasoning.
sp3ctr4l@lemmy.dbzer0.com 2 weeks ago
How about, uh…

The ability to take a previously given set of knowledge, experiences, and concepts, and combine them in a consistent, non-contradictory manner to generate hitherto unrealized knowledge or concepts; and then also to be able to verify that this new knowledge and these new concepts are actually new and actually valid, or at least to propose how one could test whether or not they are valid.
Arguably this is, or involves, meta-cognition, but that is what I would say… is the difference between what we typically think of as ‘machine reasoning’ and ‘human reasoning’.