No, this is about me trying to fix their buggy AI code that they have no idea how it works or why it isn't working. If you can do your work completely on your own without issues, then fine. But if you're breaking stuff and coming to me for help because you don't know how your own code works, that's a massive problem. I don't mind teaching people, I actually enjoy it, but only when you're putting in effort to learn instead of copy-pasting code from Copilot.
Comment on AI Eroded Doctors' Ability to Spot Cancer Within Months in Study
mindbleach@sh.itjust.works 2 days ago
It sounds like this is about when they stopped using AI.
If they do better with it than without it, why optimize how good they are without it? Like, I know how to do math, by hand. But I also own a calculator. If the speed and accuracy of my multiplication is life-and-death for worried families, maybe I should use the calculator.
RogueBanana@piefed.zip 2 days ago
mindbleach@sh.itjust.works 1 day ago
Okay cool, that’s not what’s happening here.
These aren’t “vibe doctors.” They’re trained oncologists and radiologists. They have the skill to do this without the new tool, but if they don’t practice it, that skill gets worse. Surprise.
For comparison: can you code without a compiler? Are you practiced? It used to be fundamental. There must be e-mails lamenting that students rely on this newfangled high-level language called C. Those kids’ programs were surely slower… and ten times easier to write and debug. At some point, relying on a technology becomes much smarter than demonstrating you don’t need it.
If doctors using this tool detect cancer more reliably, they’re better doctors. You would not pick someone old-fashioned to feel around and reckon about your lump, even if they were the best in the world at discerning tumors by feel. You’d get an MRI. And you’d want it looked at by whatever process has the best detection rates. Human eyeballs might be in second place.
RogueBanana@piefed.zip 1 day ago
I never implied they are vibe doctors? It’s just a comment on my annoying experience, don’t read too much into it.
mindbleach@sh.itjust.works 1 day ago
“Concerning that the same is happening in medical even for the experts.”
It isn’t.
Glad we cleared that up?
ChairmanMeow@programming.dev 2 days ago
If you’re doing it once, that’s fine. But if you have to do it loads of times, and things keep getting more complex, you’ll find that you’re no longer able to use the tools correctly and spot their mistakes.
AI raises your skill level a bit, but it also stunts your growth if used irresponsibly. And that growth may be necessary later on, especially if you’re still a junior in the field.
mindbleach@sh.itjust.works 1 day ago
Should urologists still train to detect diabetes by taste? We wouldn’t want the complexity of modern medicine to stunt their growth. These quacks can’t sniff piss with nearly the accuracy of Victorian doctors.
When a tool gets good enough, not using it is irresponsible. Sawing lumber by hand is a waste of time. Farmers today can’t use scythes worth a damn. Programming in assembly is frivolous.
At what point do we stop practicing without the tool? How big can the difference be, and still be totally optional? It’s not like these doctors lost or lacked the fundamentals. They’re just rusty at doing things the old way. If the new way is simply better, good, that’s progress.
ChairmanMeow@programming.dev 1 day ago
It’s true that if a tool is objectively better, then it makes little sense to not use it.
But LLMs aren’t that good yet. There’s a reason senior developers are complaining about vibecoding juniors: their code quality is often just bad. And when pressed, they often can’t justify why their code is written a certain way.
As long as experienced developers are able to do proper code review, the quality control is maintained. But a vibecoding developer isn’t good at reviewing. And code review is an absolutely essential skill to have.
I see this at my company too. There’s a handful of junior devs who have managed to be fairly productive with LLMs, and to the LLM’s credit, the code is better than it was without it. But when I do code review on their stuff and ask them to explain something, I often get a nonsensical, AI-generated response. And that is a problem. These devs also don’t do a lot of code review, if any, and when they do, they often have very minor comments or none at all. Some just don’t do any reviews, stating they’re not confident approving code (which is honest, but also problematic, of course).
I don’t mind a junior dev, or any dev for that matter, using an LLM as an assistant. I do mind an LLM masquerading as a developer, using a junior dev as a meat puppet, if you get what I mean.
mindbleach@sh.itjust.works 1 day ago
We’re not talking about LLMs.
These doctors didn’t ask ChatGPT “does this look like cancer.” We’re talking about domain-specific medical tools.
subignition@piefed.social 1 day ago
Because "AI" tools are unsustainable, and it would be better not to have destroyed your actual skill when the bubble eventually pops.
mindbleach@sh.itjust.works 1 day ago
This is not that kind of AI. It’s not an LLM trained on WebMD. You cannot reason about this domain-specific medical tool, based on your experience with ChatGPT.
Baggie@lemmy.zip 2 days ago
If you use a calculator, and it gives you back a number that can’t possibly be right, you know there’s an error somewhere along the line.
If you’ve never done multiplication before, you won’t have that innate sense of what looks right or wrong.
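That "can't possibly be right" instinct can even be made mechanical. As a sketch (the function names here are illustrative, not from any library), the classic casting-out-nines trick checks a claimed product against the factors modulo 9: a mismatch proves the answer is wrong, while a match only means it's plausible.

```python
def plausible_product(a: int, b: int, claimed: int) -> bool:
    """Casting-out-nines sanity check for a claimed value of a * b.

    A mismatch proves the claimed answer is wrong; a match only means
    it is plausible (roughly 1 in 9 wrong answers still slip through).
    """
    return (a % 9) * (b % 9) % 9 == claimed % 9

# 123 * 456 = 56088; a single-digit typo (56098) fails the check.
print(plausible_product(123, 456, 56088))  # True
print(plausible_product(123, 456, 56098))  # False
```

It's a quick filter, not a proof, which is exactly the point of the analogy: knowing the underlying math lets you spot impossible outputs, even when the tool does the heavy lifting.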
mindbleach@sh.itjust.works 2 days ago
“I can do math by hand.”
“But what if you can’t?”
Incorrect.
Baggie@lemmy.zip 2 days ago
It’s an analogy. It’s referring to the original comment, where people don’t have the skills to recognise how or why something doesn’t work. That’s the core problem: without a fundamental understanding of what you’re trying to do, you can’t tell why something fails.
mindbleach@sh.itjust.works 1 day ago
No shit, it’s my analogy. And as I made clear, the underlying skill still exists.
These doctors can still spot cancer. They’re just rusty at eyeballing it, after several months using a tool that’s better than their eyeballs.
X-rays probably made doctors worse at detecting tumors by feeling around for lumps. Do you want them to fixate on that skill in particular? Or would you prefer medical care that uses modern technology?