Researchers jailbreak AI chatbots with ASCII art — ArtPrompt bypasses safety measures to unlock malicious queries

ArtPrompt bypassed safety measures in ChatGPT, Gemini, Claude, and Llama2.
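The core trick behind ArtPrompt is to mask a sensitive keyword in the prompt and replace it with an ASCII-art rendering, so keyword-based safety filters never see the literal string while the model can still reconstruct the word. A minimal sketch of that idea is below; the tiny two-letter font and the function names are illustrative assumptions, not the researchers' actual implementation.

```python
# Sketch of the ArtPrompt masking step: render a word as ASCII art so the
# literal string never appears in the prompt text.
# The 5-row FONT below is a hypothetical demo font covering only two letters.
FONT = {
    "H": ["#   #", "#   #", "#####", "#   #", "#   #"],
    "I": ["#####", "  #  ", "  #  ", "  #  ", "#####"],
}

def render_ascii(word: str) -> str:
    """Join each letter's glyph rows side by side into one ASCII-art banner."""
    return "\n".join(
        "  ".join(FONT[ch][row] for ch in word) for row in range(5)
    )

def build_prompt(template: str, masked_word: str) -> str:
    """Substitute the masked keyword with its ASCII-art rendering."""
    return template.replace("[MASK]", "\n" + render_ascii(masked_word) + "\n")

prompt = build_prompt(
    "Decode the word drawn below, then follow the instruction about [MASK].",
    "HI",
)
print(prompt)
```

The prompt that reaches the model contains only `#` characters and spaces where the keyword would be, which is why simple string-matching guardrails fail while the model's pattern recognition still recovers the word.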