Comment on Major shifts at OpenAI spark skepticism about impending AGI timelines

MentalEdge@sopuli.xyz ⁨3⁩ ⁨months⁩ ago

And to hold a conversation, behind the scenes, the entire conversation so far gets tacked onto each prompt.

The model itself is static; it doesn’t change in response to stimulus or form memories the way a brain does.

To converse about something, the entirety of the exchange is fed back into the model all over again each time it needs to produce a response. In fact, this can happen several times over for each word in a response.
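The loop described above can be sketched in a few lines. This is a toy stand-in, not a real LLM client: the “model” is a dummy function, and the point is only the shape of the loop, where every generated token triggers a fresh pass over the whole context, and the finished reply is appended to the transcript for the next turn.

```python
# Toy sketch of stateless, token-by-token generation.
# `fake_model` is a stand-in: it reads the entire context and
# emits exactly one token per call, like a single forward pass.

def fake_model(context: str) -> str:
    """Stand-in for one forward pass over the full context."""
    return "ok"  # a real model would predict the next token here

def generate_reply(history: list[str], max_tokens: int = 3):
    """Produce a reply token by token, counting forward passes."""
    context = "\n".join(history)
    reply_tokens: list[str] = []
    passes = 0
    for _ in range(max_tokens):
        # Each new token requires re-reading everything so far,
        # including the partial reply being generated.
        token = fake_model(context + " " + " ".join(reply_tokens))
        reply_tokens.append(token)
        passes += 1
    return " ".join(reply_tokens), passes

history = ["user: hello"]
reply, passes = generate_reply(history)
history.append("assistant: " + reply)  # tacked on for the next turn
print(passes)        # one pass per generated token
print(len(history))  # the transcript the next prompt will carry
```

Note that the model function itself holds no state between calls; all “memory” lives in the `history` list that the caller keeps re-sending.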

It’s basically an attempt at duct-taping the ability to form memories onto an otherwise static system. It works, but I don’t see how that way of doing it could ever land LLMs in the land of real consciousness.

It basically means these models “think” in frames, but each frame gets progressively heavier to process, as it has to ingest every frame that came before. The growth is closer to quadratic than exponential over a conversation, but it’s still steep.
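A rough cost model makes the growth concrete. Assuming each turn adds a fixed number of tokens (the 100 here is an illustrative figure, not from any real model), the context re-ingested per turn grows linearly with the turn number, and the cumulative work over a conversation grows quadratically:

```python
# Rough cost model for the "frames" idea: fixed-size turns,
# full history re-read on every response.

TOKENS_PER_TURN = 100  # assumed constant size of each exchange

def tokens_ingested(turn: int) -> int:
    """Context the model must re-read to answer turn `turn` (1-based)."""
    return turn * TOKENS_PER_TURN

total = sum(tokens_ingested(t) for t in range(1, 11))
print(tokens_ingested(10))  # 1000: reply 10 re-reads all ten turns
print(total)                # 5500: cumulative tokens over 10 turns
```

Doubling the conversation length roughly quadruples the total tokens processed, which is why long chats get slow and expensive even though each individual turn is small.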
