Comment on For the First Time, Artificial Intelligence Is Being Used at a Nuclear Power Plant
cyrano@lemmy.dbzer0.com 4 days ago
I agree with you, but you can see the slippery slope with the LLM returning incorrect or hallucinated data, the same way it's happening in the public space. It may seem trivial for documentation, until you realize the documentation could be critical for some processes.
hansolo@lemm.ee 4 days ago
If you’ve never used a custom LLM or wrapper for regular ol’ ChatGPT: a lot of what it can hallucinate gets stripped out, and the entire corpus of data it’s trained on is your data. Even then, the risk is pretty low here. Do you honestly think that a human has never made an error on paperwork?
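For what it's worth, the "answers only come from your own data" idea can be sketched in a few lines. This is a toy retrieval-grounding example, not any real vendor's wrapper; the documents, function names, and the word-overlap scoring are all made up for illustration. The point is just that the system returns a stored document or refuses, rather than generating free text:

```python
# Toy sketch of grounding answers in a fixed document set:
# return the best-matching stored doc, or refuse when nothing matches,
# instead of letting a model free-generate (and hallucinate) an answer.

def tokenize(text):
    return set(text.lower().split())

def retrieve(query, docs, min_overlap=2):
    """Return the doc sharing the most words with the query, or None."""
    best_doc, best_score = None, 0
    query_words = tokenize(query)
    for doc in docs:
        score = len(query_words & tokenize(doc))
        if score > best_score:
            best_doc, best_score = doc, score
    return best_doc if best_score >= min_overlap else None

def answer(query, docs):
    """Answer only from retrieved text; refuse rather than guess."""
    doc = retrieve(query, docs)
    return doc if doc is not None else "No supporting document found."

# Hypothetical plant paperwork snippets, purely for illustration.
docs = [
    "Pump maintenance is logged in form NRC-101 after each inspection.",
    "Turbine oil samples are collected weekly by the shift engineer.",
]
print(answer("which form logs pump maintenance", docs))
```

A real deployment would use embeddings and an LLM to paraphrase the retrieved text, but the refusal path is the part that keeps hallucination risk down.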
cyrano@lemmy.dbzer0.com 4 days ago
I do, and even contained ones can return hallucinations or incorrect data. So it depends on the application you use it for. For a quick summary or data search, why not? But for some operational process, that could be problematic.