Cybercriminals can bypass OpenAI’s geofencing restrictions and gain unfettered access to ChatGPT as more stolen ChatGPT Premium accounts are traded, claims Check Point Research (CPR).
Account takeover (ATO) and the sale of stolen accounts for various online services form one of the most lucrative markets in the hacker underground and on the dark web.
In the past, this market focused on accounts from financial institutions (banks, online payment systems, etc.), social networks, online dating sites, and stolen email accounts.
OpenAI places geofencing restrictions on access to ChatGPT from certain countries, including Iran, China, and Russia.
Cybercriminals commonly exploit the fact that users reuse the same password across numerous platforms. By feeding sets of email-and-password combinations into specialized software known as an account checker, bad actors can test them against a specific online platform to discover which credentials match valid logins on that platform.
In a full account takeover, a malicious actor seizes control of an account without the account holder’s knowledge.
Researchers note that while most stolen ChatGPT Premium accounts are sold, some perpetrators give them away for free to advertise their own services or the tools used to steal them.
SilverBullet is a web testing suite that lets users send requests to a target web application. It can also be used for many other purposes, such as automated penetration testing, unit testing through Selenium, data scraping and parsing, and more.
Cybercriminals, however, widely use the program to launch credential-stuffing and account-checking attacks against various websites in order to steal user accounts for online platforms.
“Since SilverBullet is a configurable suite, conducting a checking or brute-forcing attack against a particular website necessitates the use of a ‘configuration’ file that customizes this process for that website and enables cybercriminals to automate the theft of user accounts from that website,” the researchers explained.
Another cybercriminal, who goes by the handle “gpt4,” specializes in fraud and abuse targeting ChatGPT products. In his threads he offers ChatGPT accounts for sale, along with a configuration for another automated credential-checking tool.
On March 20, an English-speaking cybercriminal began advertising a ChatGPT Plus lifetime account service that promised full customer satisfaction.
The lifetime upgrade of a standard ChatGPT Plus account (created using the buyer’s email) costs $59.99, compared to OpenAI’s legitimate pricing of $20 per month for the same service.
To cut costs further, experts point out, this underground service also offers a $24.99 lifetime option to share access to a ChatGPT account with another cybercriminal.