Cybersecurity Firm Reports Bad Actors Peddling Deepfake Tool to Circumvent Crypto Exchange Security Measures


Criminals are selling a new artificial intelligence tool that targets crypto exchange platforms. According to a report by cybersecurity firm Cato Networks, bad actors are peddling a deepfake tool on underground markets that lets buyers bypass identity-verification checks on crypto exchanges.

The tool enables users to open fraudulent verified accounts for money laundering, posing a serious risk to the integrity of these platforms. Cato Networks warns that it could drive a surge in new-account fraud, with potential losses running into the billions of dollars.

The process involves generating fake credentials and images with AI-rendering websites, forging passports, and producing deepfake videos to defeat the facial recognition systems used by crypto exchanges. By uploading fake government-issued IDs, criminals can open new, fully verified accounts on these platforms in a matter of minutes.

To combat this new form of fraud, Cato Networks advises crypto exchanges to update their security systems and stay current on emerging cybercrime trends. The firm stresses the importance of collecting threat intelligence and defending proactively against AI-driven threats.

As threat actors continue to evolve and leverage new technologies to their advantage, organizations must remain vigilant and take proactive measures to protect their platforms and users.

Disclaimer

This article was generated automatically and is not written or endorsed by the site’s editorial author.
Content may be lightly edited for factual clarity or accuracy when necessary.