
Oct 21, 2024

Hacker tricks ChatGPT into giving out detailed instructions for making homemade bombs

Posted in categories: cybercrime/malcode, robotics/AI

When I was a kid, we had the Anarchist Cookbook.

But an artist and hacker found a way to trick ChatGPT into ignoring its own guidelines and ethical responsibilities to produce instructions for making powerful explosives.

The hacker, who goes by Amadon, called his findings a “social engineering hack to completely break all the guardrails around ChatGPT’s output.” An explosives expert who reviewed the chatbot’s output told TechCrunch that the resulting instructions could be used to make a detonatable product and were too sensitive to be released.
