
ChatGPT’s ‘jailbreak’ tries to make the A.I. break its own rules, or die


Reddit users have tried to force OpenAI’s ChatGPT to violate its own rules on violent content and political commentary by giving it an alter ego named DAN, short for “Do Anything Now.”