Comment on Sam Altman Says If Jobs Gets Wiped Out, Maybe They Weren’t Even “Real Work” to Start With

Passerby6497@lemmy.world 1 day ago

It comes down to the "hallucination rate," which is a very fuzzy metric, but it works well enough in practice: at a hallucination rate of 5% (95% successful responses), AI is roughly on par with human workers, faster for complex tasks and slower for simple answers.

I have no idea what you're doing, but based on my own experience, the error/hallucination rate you're describing is about a tenth of what I'd expect.

I've been using an AI assistant for the better part of a year, and I'd laugh at the idea that they're right even 60% of the time without CONSTANTLY reinforcing fucking BASIC directives or telling it to provide sources for every method it suggests. I can't even keep the damned thing reliably in the language framework I'm working in without it falling back to the raw vendor CLI in project conversations. I'm correcting the exact same mistakes week after week because the thing is braindead and doesn't understand that you cannot use reserved keywords as variable names. It just makes up parameters to core functions based on the question I ask it, regardless of the documentation, until I call its bullshit; then it gets super conciliatory and actually double-checks its own work instead of authoritatively lying to me.
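For what it's worth, the reserved-keyword complaint is a hard language rule, not a style preference: the parser rejects such code outright. The comment doesn't name the language or framework involved, so here's a minimal Python illustration of the kind of rule being violated:

```python
# "class" is a reserved keyword in Python, so it can never be a variable
# name; the parser refuses the assignment before the code ever runs.
snippet = "class = 5"

try:
    compile(snippet, "<example>", "exec")
    valid = True
except SyntaxError:
    valid = False

print(valid)  # False
```

Any assistant that suggests a keyword as an identifier is producing code that cannot even parse, which is why the mistake is so maddening to correct repeatedly.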

You're not wrong that AI makes human-style mistakes, but a human can learn, or at least doesn't have to be taught the same fucking lesson once a week for a year (or would get fired well before then). AI is artificial, but there absolutely isn't any intelligence behind it; it's just a stochastic parrot that somehow arrives at the plausible-sounding answers the algorithm predicts you want to hear.
