Comment on Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries

misterdoctor@lemmy.world 8 months ago

I don’t know anything about AI, but I was trying to have Bing generate a photo of Jesus Christ and Satan pointing guns at the screen looking cool af, and it rejected my prompt because of the guns. So I substituted “realistic looking water guns” for “guns” and it generated the image immediately. I am writing my thesis tonight.
