Septimaeus@infosec.pub 21 hours ago
AI including LLMs are forevermore just tools in my mind. And we wouldn’t have OSHA/BMAS/HSE/etc if idiots didn’t do idiot things with tools. BUT some idiots are spared from their own idiocy only by lack of permission.
From who? Depends. Sometimes they need permission from authority: “god told me to!” Sometimes they need it from the mob: “I thought I was on a tour!” And sometimes any fucking body will do: “dare me to do it!”
And THAT in my mind is the danger truly unique to these tools: they mimic a permission-giver better than any we’ve made. They’re perfect for activating this specific category of idiot, and their (likely) unparalleled ease of use scales that danger to large numbers.
As to whether these idiots wouldn’t have just found permission elsewhere, who knows, but surely some kind of training prereq is warranted, right? That’s common with potentially dangerous tools. Or am I overthinking it?
Regrettable_incident@lemmy.world 21 hours ago
The LLM just told me to come round to your house and crap in your begonias. You might want to avoid looking out the window until I’m done.
Septimaeus@infosec.pub 19 hours ago
lol and with that you’re a better friend to the begonias than I
WhyJiffie@sh.itjust.works 12 hours ago
that sounds like a regrettable incident