Comment on What is a good eli5 analogy for GenAI not "knowing" what they say?

Hamartiogonic@sopuli.xyz 1 month ago

Remember when you were 10 and had to give a presentation about something? You just memorized some stuff word for word (or simply read it off a paper), gave the presentation, and hoped nobody had any questions. LLMs did the same thing, except they memorized several libraries’ worth of books and half the web.

No matter how much you memorize, it doesn’t change the fact that a 10-year-old still doesn’t really understand how water circulates on Earth or how plants make seeds. Sure, you can give a convincing presentation about it, but everything falls apart as soon as anyone asks a question.

With LLMs, the scale is just bigger, but they’re essentially still playing the same game. Instead of a half-page presentation, you can think of an LLM as having prepared a 6000-page presentation, of which it only shares a half-page summary with you. If you ask questions, the LLM will “scroll to the relevant page” and look up the answer. Keep poking and prodding, though, and you’ll eventually find out that it’s all memorized and the LLM understood none of it. The house of cards just doesn’t fall apart immediately, because LLMs have memorized so much.
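If you want the “memorize and regurgitate” idea made concrete, here’s a minimal toy sketch in Python. It’s a bigram Markov chain, not an actual LLM (real models use neural networks trained to predict the next token, at vastly larger scale), and the tiny corpus and names below are made up purely for illustration. The flavor is the same, though: it produces fluent-sounding sentences entirely from memorized word transitions, with no model of what water or clouds actually are.

```python
import random
from collections import defaultdict

# Toy "language model": memorize which word follows which in the
# training text, then regurgitate statistically plausible sequences.
corpus = (
    "water evaporates from the ocean water rises and forms clouds "
    "clouds cool and rain falls rain flows back to the ocean"
).split()

# "Training" is pure memorization: record every word -> next-word pair.
follows = defaultdict(list)
for word, nxt in zip(corpus, corpus[1:]):
    follows[word].append(nxt)

def generate(start, length=10):
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:  # nothing memorized after this word: dead end
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("water"))  # fluent-sounding, yet nothing is "known"
```

Run it a few times and you get different, equally confident-sounding “presentations”; ask about anything outside the memorized transitions and it simply has nothing.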

One obvious difference is that your average 10-year-old doesn’t have unshakable confidence in what they memorized, whereas LLMs tend to present everything as hard fact. If you point out a mistake, it’s usually just another exercise in futility.
