Comment on Jailbroken AI Chatbots Can Jailbreak Other Chatbots

MonkderZweite@feddit.ch 11 months ago

dangerous information

What’s that?

and offer criminal advice, such as a recipe for napalm

Is a recipe for napalm forbidden by law?

Am I the only one worried about freedom of information?
