Comment on Nurses Protest 'Deeply Troubling' Use of AI in Hospitals
ArbitraryValue@sh.itjust.works 6 months ago
My experience with the healthcare system, and especially hospitals, is that the people working there are generally knowledgeable and want to help patients, but they are also very busy and often sleep-deprived. A human may be better at medicine than an AI, but an AI that can devote attention to you is better than a human that can’t.
(The fact that the healthcare system we have is somehow simultaneously very expensive, bad for medical professionals, and bad for patients is a separate issue…)
Isn’t there a way to do both for every patient as an additional information layer?
The dangerous part is not the AI, but the idea that AI can REPLACE everything. And that’s usually on the management.
It really depends on how well the dataset is optimized for the AI. If it's shitty, the AI will amplify biases or hallucinate. An AI might be able to give a patient more attention, but if it's providing incorrect information, no attention is better than a lot of attention.
_lilith@lemmy.world 6 months ago
Those are all good points, but they assume that hospitals will use AI in addition to the workers they already have. The fear here is that they would use AI as an excuse to lay off medical staff, making the intentional understaffing even worse and decreasing the overall quality of care while absolutely burning through medical staff.