Comment on We have to stop ignoring AI’s hallucination problem

assassin_aragorn@lemmy.world 7 months ago

It’s only easier to verify a solution than to come up with one when you can trust and understand the algorithms producing it. Simulation software for thermodynamics is orders of magnitude faster than hand calculations, but you know what the software is doing. The creators of that software aren’t saying “we don’t actually know how it works”.

In the case of an LLM, I have to verify everything with no trust whatsoever, and that takes longer than just doing it myself, especially since the LLM is writing something for me rather than doing complex math.
