
Over 100,000 Stolen ChatGPT Account Credentials Sold on Dark Web Marketplace


June 20, 2023 | Ravie Lakshmanan | Endpoint / Password Security


More than 100,000 compromised OpenAI ChatGPT account credentials found their way onto illicit dark web marketplaces between June 2022 and May 2023, with India alone accounting for 12,632 of the stolen credentials.

The credentials were found in information-stealer logs offered for sale in the cybercrime underground, Group-IB said in a report shared with The Hacker News.

“The number of logs available containing compromised ChatGPT accounts reached a peak of 26,802 in May 2023,” the Singapore-headquartered company said. “The Asia-Pacific region has experienced the highest concentration of ChatGPT credentials offered for sale over the past year.”

Other countries with high numbers of compromised ChatGPT credentials include Pakistan, Brazil, Vietnam, Egypt, the U.S., France, Morocco, Indonesia, and Bangladesh.

Further analysis revealed that most of the logs containing ChatGPT accounts had been harvested by the notorious Raccoon information stealer, followed by Vidar and RedLine.


Information stealers have become popular among cybercriminals for their ability to harvest passwords, cookies, credit card details, and other data from browsers and cryptocurrency wallet extensions.

“Logs containing compromised information harvested by info stealers are actively traded on dark web marketplaces,” Group-IB said.

“Additional information about the logs available on the market includes a list of domains found in the logs as well as information about the IP addresses of compromised hosts.”

Typically offered under a subscription-based pricing model, these stealers not only lower the bar for cybercrime, but also serve as conduits for launching follow-on attacks using the siphoned credentials.

“Many companies are integrating ChatGPT into their operational flows,” said Dmitry Shestakov, head of threat intelligence at Group-IB.


“Employees enter confidential correspondence or use the bot to optimize proprietary code. Given that ChatGPT’s default configuration retains all conversations, this could inadvertently offer a trove of sensitive intelligence to threat actors who obtain account credentials.”

To mitigate such risks, it is recommended that users follow appropriate password hygiene practices and secure their accounts with two-factor authentication (2FA) to prevent account takeover attacks.
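To make the 2FA recommendation concrete: most services (including ChatGPT when 2FA is enabled) rely on time-based one-time passwords (TOTP, RFC 6238), so a stolen password alone is not enough to take over the account. The sketch below is a minimal, illustrative TOTP generator using only the Python standard library; the `totp` helper is an assumption for illustration, not OpenAI's or Group-IB's code.

```python
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, for_time=None, digits: int = 6, step: int = 30) -> str:
    """Generate an RFC 6238 time-based one-time password (SHA-1 variant)."""
    if for_time is None:
        for_time = int(time.time())
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = for_time // step
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: low nibble of the last byte picks the offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


# RFC 6238 test vector: ASCII secret "12345678901234567890" at time 59
# yields the 8-digit code "94287082".
assert totp(b"12345678901234567890", for_time=59, digits=8) == "94287082"
```

Because the code changes every 30 seconds and is derived from a shared secret the attacker never sees in a stealer log, credentials harvested by Raccoon, Vidar, or RedLine cannot by themselves complete a login on a 2FA-protected account.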

This development comes amidst an ongoing malware campaign that leverages fake OnlyFans pages and adult content feeds to deliver a remote access and information-stealing trojan called DCRat (or DarkCrystal RAT), a modified version of AsyncRAT.



“In the case observed, the victim was persuaded to download a ZIP file containing a manually executed VBScript loader,” eSentire researchers said, noting that the activity has been ongoing since January 2023.

“The file naming convention suggests that victims were lured with explicit photos or OnlyFans content featuring various adult film actresses.”

It also follows the discovery of a new VBScript variant of a malware called GuLoader (aka CloudEyE) that uses tax-themed lures to launch PowerShell scripts capable of fetching and injecting the Remcos RAT into legitimate Windows processes.

“GuLoader is a highly evasive malware loader commonly used to deliver information stealers and remote administration tools (RATs),” the Canadian cybersecurity firm said in a report published earlier this month.

“GuLoader leverages user-initiated scripts or shortcut files to execute multiple rounds of highly obfuscated commands and encrypted shellcode. The result is a memory-resident malware payload operating within legitimate Windows processes.”
