Comment on Microsoft’s $440 billion wipeout, and investors angry about OpenAI’s debt, explained

Wispy2891@lemmy.world 5 hours ago

You'd need to write a custom program if you want that: a traditional program where variables are actually stored.

The models have no memory at all; every question starts from scratch. The clients just "pretend" there is a memory by including all previous questions and answers in your latest query. You reply "ok", but the model actually receives thousands of words containing the whole history.
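Roughly, in code, the "memory" is nothing more than a list the client keeps and re-sends on every turn. A minimal sketch, where `call_model` is a hypothetical stub standing in for whatever LLM API the client actually uses:

```python
# Minimal sketch of how a chat client fakes "memory".
# call_model() is a hypothetical stub; the model itself is stateless
# and only ever sees whatever is inside `messages`.

messages = []  # the whole conversation lives on the client side

def call_model(history):
    # Placeholder: a real client would send `history` to the model API here.
    return f"(model reply after reading {len(history)} messages)"

def ask(user_text):
    messages.append({"role": "user", "content": user_text})
    # Every request re-sends the ENTIRE history, not just the new question.
    reply = call_model(messages)
    messages.append({"role": "assistant", "content": reply})
    return reply

# Even if you only type "ok", the model still receives every previous
# question and answer, which can add up to thousands of words.
print(ask("What happened to Microsoft's market cap?"))
print(ask("ok"))
```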

Because each question gets more and more expensive (the whole history is re-sent every time), at some point the client starts pruning old stuff. Either it truncates the content (for example the completely useless Meta AI chatbot that WhatsApp forced down everyone's throat loses context after 2-3 questions), or it uses the model itself to produce a condensed summary of past interactions, which is one of the ways it ends up hallucinating. A rough sketch of both pruning strategies is below.
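This is only an illustrative sketch, assuming a simple message-count cutoff (real clients budget by tokens) and a caller-supplied `summarize` function standing in for the "ask the model to condense it" step:

```python
MAX_MESSAGES = 20  # hypothetical cutoff; real clients budget by tokens, not message count

def prune(messages, summarize=None):
    """Keep the re-sent history from growing without bound."""
    if len(messages) <= MAX_MESSAGES:
        return messages
    old, recent = messages[:-MAX_MESSAGES], messages[-MAX_MESSAGES:]
    if summarize is None:
        # Strategy 1: hard truncation -- the oldest turns are silently dropped,
        # which is why some chatbots "forget" after a few questions.
        return recent
    # Strategy 2: have the model itself compress the old turns into one short
    # summary message. Cheaper than re-sending everything, but the summary can
    # drop or distort details, which is where hallucinations creep in.
    condensed = {"role": "system", "content": summarize(old)}
    return [condensed] + recent
```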

Otherwise it would cost something like $1 per question, or more.
