Spam bots and scammers have been a thorn in the side of social media platforms’ credibility for as long as anyone can remember. Whether driven by political agendas or financial gain, scammers and bots work to find unsuspecting victims.
Because cryptocurrencies are digital and their prices and popularity are highly susceptible to word of mouth, bots and scammers have stepped up their activity, despite efforts by social media developers to tackle the issue.
Researchers recently uncovered Fox8, a botnet focused specifically on crypto, made up of 1,000 accounts that use ChatGPT to generate and publish content on the topic. These accounts even used the popular AI language model to reply to potential victims.
Researchers at Indiana University Bloomington gave the botnet its name because of its connection to cryptocurrency websites carrying different versions of the same name. Fox8 was found to use ChatGPT to post content on X (formerly Twitter).
Micah Musser, a researcher who contributed to Stanford University’s paper “Generative Language Models and Automated Influence Operations: Emerging Threats and Potential Mitigations,” spoke to Wired about bots using ChatGPT to scam potential victims.
“This is a low-hanging fruit,” Musser said about the scam process. “It is very, very likely that for every one campaign you find, there are many others doing more sophisticated things.”
Musser, along with Filippo Menczer, a professor at Indiana University Bloomington, found a common denominator among many of the posts created by Fox8.
Those who have used ChatGPT have probably come across a variation of the following statement:
“As an AI language model …”
This is a sentence ChatGPT often uses when answering questions of a sensitive nature. In a way, it relieves ChatGPT and its creators of responsibility for giving faulty opinions or advice.
Musser, Menczer, and their team used this sentence to hunt down Fox8 content.
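The approach can be pictured as a simple text filter. The sketch below is purely illustrative and not the researchers’ actual pipeline; the sample posts and function name are hypothetical:

```python
# Illustrative sketch (not the researchers' actual method): flag posts that
# contain the telltale self-disclosure phrase ChatGPT sometimes emits.
TELLTALE = "as an ai language model"

def flag_suspect_posts(posts):
    """Return the posts whose text contains the telltale phrase,
    matching case-insensitively."""
    return [p for p in posts if TELLTALE in p.lower()]

# Hypothetical sample posts for demonstration.
posts = [
    "Bitcoin to the moon! Check out my profile for the link.",
    "As an AI language model, I cannot predict cryptocurrency prices.",
    "Great thread, thanks for sharing!",
]
print(flag_suspect_posts(posts))
# → ['As an AI language model, I cannot predict cryptocurrency prices.']
```

In practice, of course, such a keyword search only catches bots careless enough to post the phrase verbatim, which is exactly the sloppiness Menczer describes below.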
“The only reason we noticed this particular botnet is that they were sloppy,” Menczer told Wired. “Any pretty-good bad guys would not make that mistake.”
This speaks to the limited skill of many Fox8 operators. It does not, however, reflect poorly on potential victims, as ChatGPT is a high-level tool capable of producing human-like statements.
OpenAI managed to create a generative language tool that sounds credible and human-like. However, ChatGPT has been caught on several occasions making up facts or offering biased opinions, and it can be prompted to generate responses tailored to particular needs.
Menczer highlighted how social media platforms work. If a post on X gets a lot of engagement, either organically or from bot accounts, the algorithm is likely to show it to more users. “It tricks both the platform and the users. That’s exactly why these bots are behaving the way they do,” Menczer said.
Musser, Menczer, and OpenAI did not immediately respond to a request for comment.
Tesla founder Elon Musk’s name has become linked to crypto, for better or worse. Besides once supporting Bitcoin, leading to price surges and bullish demand for the first-mover token, Musk still promotes his preferred token, Dogecoin. His flagship company Tesla even permits payments using Dogecoin.
But Musk is now the face of X, having acquired the company and changed its name from Twitter. Before taking the helm, Musk consistently pointed out the bot problem on the platform, even going so far as to promise to eliminate it once he took over.
Needless to say, Musk’s promises have yet to come to fruition. He has, however, been seen suspending bot accounts that simply report crypto market activity.
Whether Musk manages to actually address the bot problem remains to be seen. In the meantime, researchers, investors, and enthusiasts are left to fend for themselves against wide-scale scam operations such as Fox8.