
Criminal Use of Dark AI Tools Soared 219% in 2024 as Jailbroken Chatbots Proliferate

By James Morales
Key Takeaways

  • Cybercriminals increasingly use “dark” AI tools to facilitate malicious activities.
  • Uncensored chatbots like WormGPT are used to automate fraud and hacking.
  • In 2024, mentions of dark AI tools on cybercrime forums increased by 219%.

On the underground forums where cybercriminals congregate, “dark” AI tools designed to facilitate fraud and other malicious activities are in high demand.

Among the most popular of these platforms is WormGPT, which saw its adoption surge in 2024 as criminals used it to automate phishing and business email compromise (BEC) attacks.

What Is WormGPT?

First launched in 2023, WormGPT is an unfiltered alternative to ChatGPT.

On their Telegram channel, the platform’s developers boast that it now supports 13 AI models, with DeepSeek R1 as the latest addition. Unlike mainstream chatbots, however, WormGPT has none of the safeguards other developers impose to prevent malicious use.

Like mainstream chatbots, WormGPT offers a free version and a paid tier, which costs $18 per month. A Telegram bot version of the platform has climbed to nearly 3,000 monthly users, including hundreds of paying subscribers.

Dark AI

WormGPT is among a growing suite of “dark” AI tools being adopted by criminals.

Research by the cybersecurity firm Kela found that in 2024, there was a 219% increase in mentions of malicious AI tools on cybercriminal forums.

These include WormGPT, FraudGPT and other jailbroken chatbots that don’t block illegal activities.

How Criminals Use AI

Cybercriminals increasingly use AI tools to enhance the sophistication and effectiveness of phishing and social engineering campaigns.

Tactics include using deepfakes and AI-generated emails to trick victims into revealing sensitive information.

In one recent example highlighted by Kela, criminals even claim to have built AI-powered callbots that can automate voice phishing campaigns.

Platforms like WormGPT can also be used to write and edit malicious code, making it easier to deploy threats like ransomware.

Although his background is in crypto and FinTech news, these days, James likes to roam across CCN’s editorial breadth, focusing mostly on digital technology. Having always been fascinated by the latest innovations, he uses his platform as a journalist to explore how new technologies work, why they matter and how they might shape our future.