Over the past few months, cybersecurity experts at Check Point Research (CPR) have been raising concerns about the potential risks ChatGPT poses to online security. In recent developments, CPR has warned about an alarming rise in the trade of stolen ChatGPT Premium accounts, which allows cybercriminals to bypass OpenAI's geofencing restrictions and gain unrestricted access to ChatGPT.
The market for account takeovers (ATOs) has long been a thriving underground economy in the world of hacking and on the dark web. Initially focused on stealing financial service accounts, social media profiles, and emails, this illicit market has now set its sights on stolen ChatGPT accounts, especially the premium ones.
An Increasing Trend in Activities Related to Stolen ChatGPT Accounts
Since March 2023, CPR has observed an increasing trend in the following activities related to stolen ChatGPT accounts:
- Leak and Publication of Credentials: Cybercriminals leak and publish credentials to ChatGPT accounts, exposing unsuspecting users to potential harm.
- Trade of Stolen Premium Accounts: Malicious actors on the dark web are actively trading stolen premium ChatGPT accounts.
- Brute-Forcing and Checker Tools: Cybercriminals use brute-forcing and account-checking tools to attempt unauthorized access to ChatGPT accounts by running through large lists of email addresses and passwords.
- ChatGPT Accounts as a Service: Providers actively offer dedicated services to create premium ChatGPT accounts, frequently utilizing stolen payment card information.
The Main Concerns
So why is there a growing market for stolen ChatGPT accounts, and what are the main concerns? One of the primary drivers is the geofencing restrictions ChatGPT enforces to block access from certain countries, including Russia, China, and Iran. Cybercriminals have found a way around these restrictions by leveraging the ChatGPT API and exploiting premium accounts, leading to a surge in demand for stolen accounts. Where there is demand, opportunistic cybercriminals seize the chance to profit.
Furthermore, recent discussions about ChatGPT's privacy issues, with countries like Italy banning the platform and Germany contemplating the same, have highlighted additional concerns. ChatGPT accounts retain their owners' recent queries, so when cybercriminals access a stolen account, they also obtain the owner's sensitive personal information, corporate data, and other confidential details.
Users' Habit of Reusing the Same Passwords Across Multiple Platforms
The trade of stolen accounts follows a pattern commonly seen in account takeover attacks. Cybercriminals exploit users' habit of reusing the same passwords across multiple platforms. By loading combinations of emails and passwords into account checkers, malicious actors target specific online platforms to identify credentials that match valid logins. Once they gain unauthorized access, these attackers take control of the accounts, leading to a complete takeover.
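From the defender's side, the pattern described above leaves a recognizable signature: credential-stuffing runs typically produce failed logins for many *different* usernames from the same source in a short window, unlike a legitimate user mistyping one password. A minimal detection sketch is below; the 60-second window and 10-username threshold are illustrative assumptions, not figures from the CPR report.

```python
from collections import defaultdict, deque
import time

WINDOW_SECONDS = 60           # sliding window length (illustrative assumption)
DISTINCT_USER_THRESHOLD = 10  # distinct usernames per source IP (illustrative)

# Per-IP deque of (timestamp, username) failed-login events.
_failures = defaultdict(deque)

def record_failed_login(ip, username, now=None):
    """Record a failed login and return True if the source IP looks like a
    credential-stuffing run (many distinct usernames inside the window)."""
    now = time.time() if now is None else now
    events = _failures[ip]
    events.append((now, username))
    # Drop events that have aged out of the sliding window.
    while events and now - events[0][0] > WINDOW_SECONDS:
        events.popleft()
    distinct_users = {user for _, user in events}
    return len(distinct_users) >= DISTINCT_USER_THRESHOLD

# Example: one IP cycling through a leaked email/password combo list.
for i in range(12):
    flagged = record_failed_login("203.0.113.7", f"user{i}@example.com", now=1000 + i)
print(flagged)  # True: 12 distinct usernames within one 60-second window
```

Real-world checkers such as the ones CPR describes rotate proxies precisely to dilute this kind of per-IP signal, so production defenses usually combine it with other indicators (device fingerprints, breached-credential lists, CAPTCHA challenges).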
While some cybercriminals sell stolen ChatGPT accounts, others freely share premium accounts to promote their services or tools for account theft. One example discovered by CPR involved a cybercriminal sharing four stolen premium ChatGPT accounts, indicating their use of a ChatGPT account checker.
The Tools Used For Hacking
Hacking tools like SilverBullet are highly configurable and provide automated methods to perform credential-stuffing and account-checking attacks on various websites, including ChatGPT accounts. CPR detected cybercriminals providing a configuration file for SilverBullet specifically tailored to target OpenAI's platform. This automated process can perform between 50 and 200 checks per minute (CPM) and can circumvent website protections by routing requests through proxies.
One cybercriminal uses the alias "gpt4," focusing solely on abusing and defrauding ChatGPT products. In their threads, they not only offer stolen ChatGPT accounts for sale but also provide configurations for other automated tools to check the validity of credentials.
An English-speaking cybercriminal adds to the complexity of this issue by advertising a ChatGPT Plus lifetime account service with claims of 100% satisfaction guaranteed. The service costs $59.99 for a lifetime upgrade (compared with OpenAI's legitimate pricing of $20 per month), and shared access to a lifetime account with another cybercriminal is offered at a reduced price of $24.99. CPR suspects that the sellers use compromised payment cards to pay for these unauthorized upgrades.
A Significant Cybersecurity Risk – What To Do To Protect Yourself?
In conclusion, the rise in the market for stolen ChatGPT accounts poses significant cybersecurity risks. We advise users to maintain vigilance over their online accounts, refrain from using the same passwords across multiple platforms, and regularly update them to enhance their digital security.
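As a small illustration of the "unique password per platform" advice, a cryptographically secure generator such as Python's standard `secrets` module can produce a distinct strong password for every site, so a leak on one platform cannot be replayed against another. A minimal sketch (the 16-character length is an arbitrary choice; in practice a password manager handles this):

```python
import secrets
import string

def generate_password(length=16):
    """Generate a random password from letters, digits, and punctuation
    using a cryptographically secure source (the secrets module)."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One distinct password per platform defeats the credential-stuffing
# pattern described above: a combo list stolen from one site yields
# nothing on the others.
for site in ("chat.openai.com", "example-mail.com"):
    print(site, generate_password())
```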
OpenAI and other platform providers should also strengthen security measures to safeguard users' data and protect against account takeovers and unauthorized access. Users can actively contribute to a safer online environment by staying informed and cautious.