WormGPT: the cybercriminal version of ChatGPT

It was only a matter of time before the famous ChatGPT was emulated for malicious purposes and one such tool now on the market is known as WormGPT. Should we be worried? Here’s everything you need to know.

When ChatGPT was made available to the public on November 30, 2022, the AI chatbot took the world by storm.

The software was developed by OpenAI, an artificial intelligence research company. ChatGPT is a natural language processing tool able to answer questions and provide information based on data collected from various sources, including books and online web pages. It has become an invaluable resource for information gathering, analysis, and writing for millions of users around the world.

While some experts believe that this technology could prove disruptive to the internet, others point out that ChatGPT demonstrates a certain inaccuracy in its responses. Many students have been caught plagiarizing coursework via the tool, and unless training datasets are verified, AI chatbots like ChatGPT could become unwitting tools for spreading misinformation and propaganda.

In fact, the US Federal Trade Commission (FTC) is investigating OpenAI, specifically its handling of personal information and the data used to create its language model.

Beyond data protection concerns, however, every new technological innovation that benefits humans also opens a parallel path for abuse. It was only a matter of time before the chatbot was emulated for malicious purposes, and one such tool is now on the market, known as WormGPT.

What is WormGPT?

On July 13, researchers at cybersecurity firm SlashNext published a blog post revealing the discovery of WormGPT, a tool promoted for sale on a hacker forum.

According to the forum user, the WormGPT project aims to be a blackhat “alternative” to ChatGPT , “a tool that lets you do all sorts of illegal stuff and easily sell it online in the future.”

SlashNext gained access to the tool, which was described as an AI module based on the GPT-J language model. WormGPT was supposedly trained on data sources that include malware-related information, but the specific datasets remain known only to the author of WormGPT.

WormGPT may, for example, be capable of generating malicious code or convincing phishing emails.

What is WormGPT used for?

WormGPT is described as "similar to ChatGPT but [with] no ethical boundaries or limitations".

ChatGPT has a set of rules in place to try to prevent users from abusing the chatbot unethically. This includes refusing to complete requests related to crime and malware. However, users constantly find ways to bypass these limitations.

The researchers were able to use WormGPT to "generate an email intended to pressure an unsuspecting account manager into paying a fraudulent invoice". The team was surprised by how well the language model handled the task, calling the result "remarkably persuasive [and] also strategically astute."

While the researchers have not said whether they tested the tool's malware-writing capabilities, it is plausible that the AI bot could produce malware, given that the limits imposed on ChatGPT do not apply.

According to posts seen by ZDNET on a Telegram channel launched to promote the tool, the developer is creating a subscription model for access, ranging from $60 to $700. One member of the channel, "darkstux", claims there are already over 1,500 WormGPT users.


Our WormGPT tool in action 👍
Buy/Questions – https://t.co/XXv2dm4UCC #WormGPT #AI #CyberSecurity #ChatGPT #OpenAI #WormAI #Worm #ArtificialIntelligence #Malware pic.twitter.com/toBb8NfU8D

— WormGPT (@wormgpt) July 17, 2023

Is WormGPT the same as ChatGPT?

No. ChatGPT was developed by OpenAI, a legitimate and respected organization. WormGPT is not OpenAI's creation, and is an example of how cybercriminals can draw inspiration from advanced AI chatbots to develop their own malicious tools.

Will there be more tools like WormGPT in the future?

Even in the hands of novices and typical scammers, natural language models could turn easily avoidable phishing and BEC (business email compromise) scams into sophisticated operations with a greater chance of success. There is no doubt that where there is money to be made, cybercriminals will pursue the opportunity, and WormGPT is just the beginning of a new range of cybercriminal tools destined to be traded in underground markets.

It is also unlikely that WormGPT is the only such tool on the dark web.

What do regulators say about the abuse of AI tools?

Europol: Europol's 2023 report, "The Impact of Large Language Models on Law Enforcement", states that "it will be crucial to monitor [...] the development, as dark LLMs (large language models) trained to facilitate malicious output may become a key criminal business model of the future. This poses a new challenge for law enforcement, whereby it will become easier than ever for bad actors to perpetrate criminal activity without the necessary prior knowledge."

Federal Trade Commission: The FTC is investigating ChatGPT maker OpenAI over data use policies and inaccuracies.

UK National Crime Agency (NCA): The NCA warns that AI could lead to an explosion of risk and abuse for young people.

UK Information Commissioner's Office (ICO): The ICO has reminded organizations that their AI tools are still bound by existing data protection laws.

Can ChatGPT be used for illegal purposes?

Not without workarounds, but with the right prompts, many chatbot models can be persuaded to perform particular actions and tasks.

ChatGPT, for example, can draft professional emails, cover letters, resumes, purchase orders, and more. This alone removes some of the most common indicators of a phishing email: spelling mistakes, grammatical errors, and the telltale signs of a non-native speaker. That in itself could cause problems for companies trying to train their staff to recognize suspicious emails.

SlashNext researchers say that "cybercriminals can use such technology to automate the creation of highly convincing fake emails, personalized to the recipient, thus increasing the chances of a successful phishing attack."

How much does ChatGPT cost?

ChatGPT is free. The tool can be used to answer general questions, write content and code, or generate prompts for anything from creative stories to marketing projects.

There is also a subscription option, ChatGPT Plus, that users can sign up for. The subscription costs $20 per month and gives users access to ChatGPT even during peak times, faster response times, and priority access to improvements and fixes.