Comment on Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries

popekingjoe@lemmy.world 3 months ago

Drugs. Mostly. Probably.
