Comment on Jailbroken AI Chatbots Can Jailbreak Other Chatbots

ninekeysdown@lemmy.world 11 months ago

Info hazards are going to be more commonplace with this kind of technology. At the core of the problem is the ease of access to dangerous information. For example, a lot of chatbots will confidently get things wrong. Combine that with easy directions for making something like napalm or meth, and we get dangerous things that could be made incorrectly. (Granted, napalm or meth isn't that hard to make.)

As to what makes it dangerous information: it's unearned. A chemistry student can make drugs, bombs, etc., but they learn/earn that information (and ideally the discipline to use it). Kind of like how in the US we are having more and more mass shootings due to the ease of access to firearms. Restrictions on information or firearms aren't going to solve the problems that cause them, but they do make things (a little) harder.

At least that's my understanding of it.

emergencyfood@sh.itjust.works 11 months ago

Anyone who wants to make even slightly complex organic compounds will also need to study five different types of isomerism and how they determine the major / minor product. That should be enough of a deterrent.
MonkderZweite@feddit.ch 11 months ago

I don't exactly agree with the "earned" part, but I guess you have a point about the missing "how to safely handle" knowledge.
ninekeysdown@lemmy.world 11 months ago

By "earned" I mean it takes some effort to gain that knowledge: some kind of training, studying, practice, etc.