Comment on Yann LeCun just raised $1bn to prove the AI industry has got it wrong
Not_mikey@lemmy.dbzer0.com 1 day ago
What exactly is this for? I understand LLMs have their limits with understanding physical reality, but at least they have a use case: theoretically automating the “symbolic work” (i.e. moving symbols around on a screen or piece of paper) that white collar workers do.
Yes, it’ll never be able to cook a meal or change a lightbulb, but neither will this without significant advances in robotics to embody the AI. What’s the use case? Being able to better tell you how to throw a ball than a person can?
SuspciousCarrot78@lemmy.world 1 day ago
World models aren’t just for robotics (though they definitely WILL be used for that). They’re for reasoning under uncertainty in domains where you can’t see the outcome. E.g.:
Medical diagnosis: you can’t physically “embody” whether a treatment will work. But a system that understands disease progression, drug interactions, and physiological constraints (not by pattern-matching text, but by learning causal structure) - well, that’s fundamentally different from an LLM hallucinating plausible-sounding symptoms.
Financial modeling, engineering simulations, climate prediction…all domains where the “embodied experience” is simulation, not physical interaction. You learn how the world actually works by understanding constraints and causality, not by predicting the next token in a Bloomberg article.
The point isn’t “robots will finally work.” The point is: understanding causality is cheaper and more reliable than memorizing correlations. Embodiment is just the training signal that forces you to learn causality instead of surface patterns. LeCun’s betting that a system trained to predict abstract state transitions in any domain (be that medical, financial, physical) will generalize better / hallucinate less than one trained to predict text.
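To make the causality-vs-correlation distinction concrete, here’s a toy sketch (my own illustration, nothing to do with LeCun’s actual architecture): a “memorizer” that stores observed state transitions as a lookup table fails on any state it hasn’t seen, while a model that has captured the transition rule itself generalizes:

```python
# Toy world: state is (position, velocity); true dynamics are
# next_position = position + velocity, velocity unchanged.

train_states = [(0, 1), (2, 1), (5, -1)]

# "Memorized correlations": a lookup table of observed transitions.
table = {(p, v): (p + v, v) for (p, v) in train_states}

def memorizer(state):
    # Can only echo transitions it has literally seen before.
    return table.get(state)  # None for anything unseen

def world_model(state):
    # Has learned the transition *rule*, so it applies anywhere.
    p, v = state
    return (p + v, v)

unseen = (100, 3)
print(memorizer(unseen))    # None: no memorized pattern matches
print(world_model(unseen))  # (103, 3): the causal rule still applies
```

Obviously real systems learn fuzzy statistical versions of both, but the generalization gap is the same shape: the bet is that forcing a model to predict state transitions pushes it toward the second kind of knowledge.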
Whether that’s true? Fucked if I know - that’s why it’s (literally) the billion-dollar question.
But “it won’t cook dinner” misses the point (and besides which, it might actually cook dinner and change lightbulbs, so…)