Key Takeaways
On the underground forums where cybercriminals congregate, “dark” AI tools designed to facilitate fraud and other malicious activities are in high demand.
Among the most popular of these platforms is WormGPT, which saw its adoption surge in 2024 as criminals used it to automate phishing and business email compromise (BEC) attacks.
First launched in 2023, WormGPT is an unfiltered alternative to ChatGPT: unlike mainstream chatbots, it lacks the guardrails other developers impose to prevent malicious use. On their Telegram channel, the platform’s developers boast that it now supports 13 AI models, with DeepSeek R1 the latest addition.
Like mainstream chatbots, WormGPT offers a free version and a paid tier, which costs $18 per month. A Telegram bot version of the platform now counts nearly 3,000 monthly users and hundreds of paying subscribers.
WormGPT is among a growing suite of “dark” AI tools being adopted by criminals.
Research by the cybersecurity firm Kela found that in 2024, there was a 219% increase in mentions of malicious AI tools on cybercriminal forums.
These include WormGPT, FraudGPT and other jailbroken chatbots that won’t refuse requests tied to illegal activity.
Cybercriminals increasingly use AI tools to make phishing and social engineering campaigns more sophisticated and effective. Tactics include deepfakes and AI-generated emails that trick victims into revealing sensitive information.
In one recent example highlighted by Kela, criminals even claim to have built AI-powered callbots that can automate voice phishing campaigns.
Platforms like WormGPT can also be used to write and edit malicious code, making it easier to deploy threats like ransomware.