Comment on AGI achieved
outhouseperilous@lemmy.dbzer0.com 2 days ago
It doesn’t know things.
It’s a statistical model. It cannot synthesize information or problem solve, only show you a rough average of its library of inputs graphed by proximity to your input.
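[Editor's note: a toy sketch of the "proximity-weighted lookup over a library of inputs" idea the comment above describes. This is not how transformer LLMs are actually implemented; the names `embed`, `memory`, and `respond` are invented purely for illustration.]

```python
import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    """Hypothetical embedding: hash characters into a fixed-size vector."""
    vec = np.zeros(dim)
    for i, ch in enumerate(text):
        vec[i % dim] += ord(ch)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# The "library of inputs": stored (prompt, response) pairs.
memory = [
    ("what is 2+2", "4"),
    ("capital of france", "Paris"),
    ("what is 3+3", "6"),
]

def respond(prompt: str) -> str:
    """Return the stored response whose prompt is closest to the query,
    i.e. the 'average graphed by proximity' reduced to nearest-neighbour."""
    query = embed(prompt)
    scores = [float(query @ embed(p)) for p, _ in memory]
    return memory[int(np.argmax(scores))][1]

print(respond("what is 2 + 2"))  # likely prints "4"
```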
jsomae@lemmy.ml 2 days ago
Congrats, you’ve discovered reductionism. The human brain also doesn’t know things, as it’s composed of electrical synapses made of molecules that obey the laws of physics and direct one’s mouth to make words in response to signals that come from the ears.
Not saying LLMs don’t know things, but your argument as to why they don’t know things has no merit.
outhouseperilous@lemmy.dbzer0.com 2 days ago
Oh, that’s why everything else you said seemed a bit off.
jsomae@lemmy.ml 2 days ago
sorry, I only have a regular brain, haven’t updated to the metaphysical edition :/