Comment on "LLMs develop their own understanding of reality as their language abilities improve"

wizardbeard@lemmy.dbzer0.com 2 months ago

Wow, that headline is significantly overstated. This has nothing to do with “understanding” or “reality”. Like holy shit, this is an absurd leap from what they actually state in the abstract. Thanks, OP, for including the abstract in the “preview” here.

Given a number of “starting states” and “ending states” of 2D grid-based “environments”, an LLM is able to return increasingly accurate “between states” as it is trained.

Neat! Not an understanding of reality. At the very biggest stretch, it is an “understanding” of an incredibly constrained problem space.
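
For concreteness, here's a minimal toy sketch (my own code, not the paper's) of the kind of evaluation described above: grid states get serialized to strings, the model is given a start and end state, and its predicted intermediate states are scored by exact match. `serialize`, `score_between_states`, and the dummy model are all hypothetical names, just to show the shape of the task.

```python
# Toy sketch of the "between states" evaluation, assuming grid states are
# serialized to token strings and predictions are scored by exact match.
from typing import Callable

Grid = list[list[str]]

def serialize(grid: Grid) -> str:
    """Flatten a 2D grid into a token string, row by row."""
    return " / ".join(" ".join(row) for row in grid)

def score_between_states(
    start: Grid,
    end: Grid,
    true_between: list[Grid],
    model_predict: Callable[[str, str, int], list[str]],
) -> float:
    """Fraction of intermediate states the model reproduces exactly."""
    preds = model_predict(serialize(start), serialize(end), len(true_between))
    hits = sum(p == serialize(t) for p, t in zip(preds, true_between))
    return hits / len(true_between)

# Toy environment: a marker '*' moving one cell right per step in a 1x3 grid.
start = [["*", ".", "."]]
mid   = [[".", "*", "."]]
end   = [[".", ".", "*"]]

# A dummy "model" that happens to answer correctly, standing in for the LLM.
dummy = lambda s, e, n: [serialize(mid)]
print(score_between_states(start, end, [mid], dummy))  # 1.0
```

Getting better at this as training progresses is a real result, but note how narrow it is: one grid, one transition rule, exact-match scoring.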
