Comment on Lawyers increasingly have to convince clients that AI chatbots give bad advice
a4ng3l@lemmy.world 2 weeks ago
Yeah well same applies for a lot of tools… I’m not certified for flying a plane and look at me not flying one either… but I’m not shitting on planes…
ToTheGraveMyLove@sh.itjust.works 2 weeks ago
But planes don’t routinely spit out false information.
WeavingSpider@lemmy.world 2 weeks ago
I understand what you mean, but… *looks at Birgenair 301 and Aeroperu 603* *looks at Qantas 72* *looks at the 737 Max 8 crashes* Planes have spat out false data, and of the 5 cases mentioned, only one avoided disaster.
It is down to the humans in the cockpit to filter through the data and know what can be trusted. Which could be similar to LLMs, except cockpits have a two-person team to catch errors and keep things safe.
ToTheGraveMyLove@sh.itjust.works 2 weeks ago
So you found five examples in the history of human aviation. How often do you think AI hallucinates information? Because I can guarantee you it’s a hell of a lot more frequently than that.
WeavingSpider@lemmy.world 2 weeks ago
You should check out Air Crash Investigation, amigo, all 26 seasons. You’d be surprised what humans in metal life support machines can cause when systems break down.
a4ng3l@lemmy.world 2 weeks ago
If you can’t fly a plane, chances are you’ll crash it. If you can’t use LLMs, chances are you’ll get shit out of them… the outcome of using a tool is directly correlated to one’s ability?
Sounds logical enough to me.
DrunkenPirate@feddit.org 2 weeks ago
Sure. However, the output of an LLM always looks plausible. And if you aren't a subject matter expert, the plausible result looks very right. That's the difference - it's hard to spot the wrong things (even for experts).
a4ng3l@lemmy.world 2 weeks ago
So are a speedometer and an altimeter, until you reaaaaaaaaly need to understand them.
I mean, it all boils down to the proper tool with proper knowledge and ability. It’s slightly exacerbated by the apparent simplicity, but if you look at it as a tool it’s no different.
ToTheGraveMyLove@sh.itjust.works 2 weeks ago
Except with a plane, if you know how to fly it you’re far less likely to crash it. Even if you “can use LLMs” there’s still a pretty strong chance you’re going to get shit back, due to its very nature. With one, the machine works with you; with the other, the machine is always working against you.
a4ng3l@lemmy.world 2 weeks ago
Nah, that’s just plain wrong… you can also fantastically screw up flying a plane, but so long as you use LLMs safely you’re golden.
It also has no will of its own; it is not “working against you”. Don’t give those apps a semblance of intent.