AI introduces a new dimension to the crypto crime problem by enabling scammers to more easily impersonate celebrities, according to blockchain intelligence firm Elliptic.
Artificial intelligence (AI) has the potential to reshape the global economy, but it also poses significant risks by enabling new forms of crypto crime. Blockchain intelligence firm Elliptic highlights these concerns in a recent research report titled “AI-enabled crime in the cryptoasset ecosystem.” The London-based forensic firm points out that threat actors are already exploiting AI for illicit activities. The report warns that AI can be used to create convincing deepfakes of celebrities, politicians, and industry leaders, which scammers use to falsely legitimize fraudulent projects.
A string of deepfakes has specifically targeted Ripple (XRP) and its CEO, Brad Garlinghouse, particularly after the company won its court battle with the U.S. Securities and Exchange Commission in July 2023.
Elliptic also noted that the hype around AI has led to the creation of GPT-themed tokens, which scammers promote by promising high returns. The firm identified “hundreds of tokens listed on several blockchains” that include the term “GPT” in their name.
Some may indeed reflect well-intentioned ventures, but a number of them have been shilled in amateur trading forums by scammers falsely claiming an official association with ChatGPT or other legitimate AI companies.
The British blockchain forensics firm says the vast majority of AI-related threats in crypto remain in their infancy, underscoring the need for vigilance and proactive measures against emerging forms of crypto crime.