You really gotta hope those 5-10% LLM interpretive miss rates don’t hit your particular case.
Also, if it fucks up, the doctor is still liable for malpractice, right? Or do they get to kick that ball down into the abyss of trying to get LLM companies to take responsibility for their products?
Semi_Hemi_Demigod@lemmy.world 6 days ago
My fiancée works at a vet clinic that uses it, and she says the doctors love it. The customers, however, don’t like that there’s an AI listening to their visit, so the clinic just calls it “software.”
tyler@programming.dev 6 days ago
Transcription software has existed for decades and has no need for AI. It doesn’t need to interpret anything you’re saying; shoving AI into it is literally just making things worse.
Photuris@lemmy.ml 6 days ago
In this particular use case, no. The LLM not only transcribes, but it summarizes, drafts, and categorizes as well (ICD-10 codes, cross-referencing medical history, etc.).
Very useful for overworked and under-resourced healthcare workers.
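For what it’s worth, the “categorize” step can be sketched in a few lines. This is a toy phrase-to-code lookup standing in for the LLM (the table, codes shown, and function name are just for illustration; real ambient-scribe products use a model plus human review):

```python
# Toy sketch of the post-transcription categorization step: suggest
# ICD-10 codes from phrases in a visit summary. A real system would
# use an LLM and clinician sign-off; this lookup is illustrative only.
ICD10_HINTS = {
    "sore throat": "J02.9",      # acute pharyngitis, unspecified
    "hypertension": "I10",       # essential (primary) hypertension
    "type 2 diabetes": "E11.9",  # T2DM without complications
}

def suggest_codes(summary: str) -> list[str]:
    """Return candidate ICD-10 codes for phrases found in the summary."""
    text = summary.lower()
    return [code for phrase, code in ICD10_HINTS.items() if phrase in text]

print(suggest_codes("Patient presents with sore throat; history of hypertension."))
# ['J02.9', 'I10']
```

The point of the sketch: the LLM isn’t just transcribing audio, it’s doing this kind of mapping (plus summarizing and drafting) across the whole note, which is why plain dictation software isn’t a substitute here.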