Archived version: archive.ph/XjlYL
Archived version: web.archive.org/…/ai-with-90-error-rate-forces-el…
I read about an early study into AI where they were using it to predict whether the pictured animal was a dog or a wolf. It got really good at detecting wolves, but when they analyzed how it was deciding whether something was a wolf or not, they found it wasn't looking at the animal at all; it was checking whether there was a lot of snow on the ground. If there was snow, it said wolf; if there wasn't, it said dog.
The problem was with the data set used to train the AI; the model was doing exactly what it was told. That's the big problem with AI: it does exactly what we tell it to do, but people are hilariously bad at describing exactly the result they want down to the absolute finest level of detail.
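To make the snow shortcut concrete, here is a toy sketch of how a classifier can latch onto a spurious background feature. Everything in it is made up for illustration (the hand-built features, the perfect snow/wolf correlation in the training set); the actual study worked on image pixels, not tabular features.

```python
# Toy sketch of the dog/wolf "snow detector" failure mode described above.
# All feature names and numbers are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

label = rng.integers(0, 2, n)                  # 0 = dog, 1 = wolf
ear_shape = label + rng.normal(0.0, 2.0, n)    # weak, noisy animal feature
snout_length = label + rng.normal(0.0, 2.0, n) # weak, noisy animal feature
snow = label.astype(float)                     # spurious: snow iff wolf in the training photos

X = np.column_stack([ear_shape, snout_length, snow])
model = LogisticRegression().fit(X, label)

# The snow weight dwarfs the animal features: the model is a snow detector.
print(dict(zip(["ear_shape", "snout_length", "snow"], model.coef_[0].round(2))))

# A dog photographed in winter now gets called a wolf.
dog_in_snow = np.array([[0.0, 0.0, 1.0]])
print("dog in snow ->", "wolf" if model.predict(dog_in_snow)[0] == 1 else "dog")
```

Because snow tracks the label perfectly during training, the model leans on it almost exclusively, exactly as the anecdote describes.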
bernieecclestoned@sh.itjust.works 1 year ago
Guessing that’s not an error rate, just programmed to refuse first attempts to save cash
ellabee@sh.itjust.works 1 year ago
I was in medical billing about 20 years ago, specifically working to get ambulance billing paid by United Healthcare, Blue Cross, whatever. At that time I hated United slightly more than the VA. The VA was a year behind on payment, and they sent a lump check with the list of what it covered separately, but at least they kept track and paid.
We had to take United Healthcare to the insurance commissioner because their process was: deny, then lose the claim, then deny for late billing.
Instead of responding to the insurance commissioner or providing the requested docs or anything, they waited it out, paid the fine, paid the specific claims, and continued as usual.
So yeah. AI working the way they trained it.
EmergMemeHologram@startrek.website 1 year ago
Nonsense.
It’s most likely just a ~linear regression~ random forest model trained with the loss function of
sum((y - y_hat)^2 * cost)
I hate that basic prediction tools are "AI". And they definitely were in 2019.
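As a concrete reading of the formula above, sum((y - y_hat)^2 * cost) is just a cost-weighted squared error. The sketch below only illustrates that formula; the variable meanings (actual vs. predicted days of covered care, per-claim cost weights) are assumptions, not anything known about the actual system.

```python
import numpy as np

def cost_weighted_sse(y, y_hat, cost):
    """Squared error per case, scaled by how expensive that case is."""
    return float(np.sum((y - y_hat) ** 2 * cost))

# Hypothetical numbers: actual vs. predicted days of covered care, plus a
# per-claim cost weight. Expensive claims dominate the loss, so the model
# is pushed hardest to be "right" where the money is.
y = np.array([30.0, 10.0, 45.0])
y_hat = np.array([12.0, 9.0, 20.0])
cost = np.array([900.0, 150.0, 1200.0])

print(cost_weighted_sse(y, y_hat, cost))
```

In practice you would rarely write the loss by hand: for a squared-error model, passing the cost as a per-sample weight (e.g. the sample_weight argument to scikit-learn's fit methods) gives the same weighting.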
bernieecclestoned@sh.itjust.works 1 year ago
I was being facetious, I have no idea