Comment on Jailbroken AI Chatbots Can Jailbreak Other Chatbots
Syrus@lemmy.world 11 months ago
You would need to know the recipe to avoid making it by accident.
echodot@feddit.uk 11 months ago
Especially considering it’s actually quite easy to make by accident.