
Feb 8, 2023

ChatGPT’s ‘jailbreak’ tries to make the A.I. break its own rules, or die

Posted in category: robotics/AI

Reddit users have tried to force OpenAI’s ChatGPT to violate its own rules on violent content and political commentary by prompting it to adopt an alter ego named DAN.
