
Aug 20, 2023

Meet FraudGPT: The Dark Side Twin of ChatGPT

Posted by in categories: cybercrime/malcode, robotics/AI

So, malicious AI has arrived, and on purpose. It reminds me of the cases where hackers switched traffic lights to green in both directions, causing accidents and human harm. It's sad that such great tools are misused.


ChatGPT has become popular, influencing how people work and what they can find online. Many people, even those who haven't tried it, are intrigued by the potential of AI chatbots. The prevalence of generative AI models has altered the nature of potential dangers, and evidence of FraudGPT's emergence can now be seen in recent dark web forum threads. Cybercriminals have been investigating ways to profit from this trend.

Researchers at Netenrich have uncovered a new artificial intelligence tool called "FraudGPT." This AI bot was built specifically for malicious activities, including sending spear-phishing emails, developing cracking tools, and carding. The product is for sale on numerous dark web marketplaces and on the Telegram app.

What is FraudGPT?

Like ChatGPT, but with the added ability to generate content for use in cyberattacks, FraudGPT can be purchased on the dark web and through Telegram. The Netenrich threat research team first noticed it being advertised in July 2023. One of FraudGPT's selling points is that it lacks the safeguards and restrictions that make ChatGPT unresponsive to questionable queries.
