I’ve started using an AI driver for my car. And by “AI” I mean I use a bungee cord on the steering wheel to keep it straight. Straight is the correct answer 40% of the time, so it works out.
Oh, and by “my car”, I mean the people that work for me. I insist that they use my bungee-cord idea to steer their cars if they want to work for me. There may be a few losses, but that’s ok. I can always fire the ones that die and hire more.
I’m a genius.
Comment on Coinbase CEO explains why he fired engineers who didn’t try AI immediately
skulblaka@sh.itjust.works 7 months ago
In my left hand, I have a manfile, written by the very same people who wrote the tool or language that I’m trying to use. It is concise, contains true information, and won’t change if I look up the same thing again later.
In my right hand, I have a pathological liar, who also kinda sorta read the manfile and then smooshed it together with 20 other manuals.
I wonder which of these options is a more reliable reference tool for me? Hmm. It’s difficult to tell.
sturger@sh.itjust.works 7 months ago
8uurg@lemmy.world 7 months ago
In my experience that isn’t guaranteed: documentation is sometimes not kept up to date, and the information may be outdated or missing entirely.
Documentation is much more reliable, yes, but sadly not always true or complete.
skulblaka@sh.itjust.works 7 months ago
Sure, and I’ve also had my share of cursing at poor documentation.
If that’s the case, though, then your AI is also going to struggle to give you usable information.
8uurg@lemmy.world 7 months ago
My point was solely that human-written documentation is far from as reliable as your comment insinuated. Compared to an LLM it is reliable, but it is far from perfect.
In my view, an (my?) AI is going to struggle whether or not the documentation is in order: those models already get confused by different versions of the same library having different interfaces and functions.