Comment on As AI enters the operating room, reports arise of botched surgeries and misidentified body parts
Armand1@lemmy.world 1 week ago
Hmmm…
As the article correctly states, machine learning (“AI” is a misnomer that has stuck imo) has been used successfully for decades in medicine.
Machine learning is inherently about spotting patterns and inferring from them. The problem, I think, is two-fold:
- There are more “AI” products than ever, not all companies build them responsibly, and it’s difficult for regulators to keep up with them.
- As AI is normalised, some doctors will put too much trust in these systems.
This isn’t helped by the fact that the makers of these products are likely to exaggerate their capabilities. That exaggeration may carry over into the products themselves, which may not properly communicate the degree of certainty of a diagnosis or conclusion (e.g. “30% certainty this lesion is cancerous”).
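To make the point concrete, here's a minimal sketch of what surfacing that certainty could look like: instead of a bare positive/negative label, the tool reports the model's probability and explicitly flags the uncertain middle range for clinician review. The function name and thresholds are hypothetical, purely for illustration.

```python
# Hypothetical sketch: report a model probability with the uncertainty
# kept visible, rather than collapsing it into a binary label.
# The thresholds (0.85 / 0.15) are illustrative, not clinical guidance.

def report_finding(p_malignant: float,
                   high: float = 0.85,
                   low: float = 0.15) -> str:
    """Turn a raw probability into a message that preserves uncertainty."""
    pct = round(p_malignant * 100)
    if p_malignant >= high:
        return f"{pct}% certainty this lesion is cancerous - recommend biopsy"
    if p_malignant <= low:
        return f"{pct}% certainty this lesion is cancerous - routine follow-up"
    # The uncertain band is surfaced explicitly instead of silently rounded
    # to a yes/no answer.
    return (f"{pct}% certainty this lesion is cancerous - "
            "inconclusive, clinician review required")

print(report_finding(0.30))
```

A 30% score lands in the inconclusive band, so the tool asks for human review rather than implying a confident answer either way.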
HubertManne@piefed.social 1 week ago
It seems like a lot of AI problems come down to how people treat it. It needs to be treated like a completely naive and inexperienced intern, student, or helper. Everyone should expect that all output has to be carefully looked over, like a teacher checking a student’s work.