Compliance with anti-money laundering (AML) and know-your-customer (KYC) laws and regulations is often a major concern for digital asset companies. Despite efforts to monitor and detect fraud, digital asset companies face significant challenges due to advances in technological threats, including generative AI.
Generative AI can produce highly realistic deepfakes and fake documents, and it can almost instantly weave a convincing life story to support false information, for example by seeding a person’s social media accounts with fabricated posts. In one recent case, scammers used deepfake technology to simulate a video conference involving the CFO and other executives of a multinational financial company and tricked an employee into transferring nearly $26 million to the scammers.
In short, the ability of generative AI to produce convincing deepfake audio and images almost instantly may wreak havoc on existing governance systems designed to protect consumers.
Current KYC mechanisms are insufficient in the face of advances in generative AI
AML and KYC programs are essential for financial institutions to verify the identity of their customers and to comply with laws designed to combat money laundering, fraud, and terrorist financing. However, many cryptocurrency companies have weak or porous KYC controls, leading to an increased risk of fraud. According to CoinDesk, cryptocurrency users lost nearly $4 billion to “scams, thefts, and hacks” in 2022 and an estimated $2 billion in 2023.
Since digital asset businesses typically do not have physical premises like traditional financial institutions, they must use KYC methods suited to a remote environment. Commonly used KYC verification methods include the following (a simplified intake-flow sketch follows the list):
- Taking a selfie while holding a handwritten sign with today’s date;
- Taking a photo of the user’s driver’s license or other government-issued identification; or
- Recording a live video answering security questions to confirm the user’s identity and “liveness.”
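To make these steps concrete, here is a minimal sketch of a remote KYC intake check. It is illustrative only: the `KycSubmission` structure, its field names, and the pass/fail rules are assumptions made for this example, not a description of any particular exchange’s process.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class KycSubmission:
    """One remote KYC submission: an ID photo plus selfie/liveness artifacts."""
    legal_name: str
    id_document_name: str   # name extracted (e.g., via OCR) from the ID photo
    id_expiry: date         # expiry date extracted from the ID photo
    selfie_sign_date: date  # date written on the handheld sign in the selfie
    liveness_passed: bool   # result of the live-video / liveness challenge

def review_submission(sub: KycSubmission, today: date) -> list[str]:
    """Return reasons to reject; an empty list means the basic checks passed."""
    problems = []
    if sub.id_document_name.strip().lower() != sub.legal_name.strip().lower():
        problems.append("name on ID does not match the account holder")
    if sub.id_expiry < today:
        problems.append("ID document is expired")
    if sub.selfie_sign_date != today:
        problems.append("handwritten sign does not show today's date")
    if not sub.liveness_passed:
        problems.append("liveness check failed")
    return problems

# Example: a submission that clears all of the basic checks.
sub = KycSubmission(
    legal_name="Jane Doe",
    id_document_name="Jane Doe",
    id_expiry=date(2028, 5, 1),
    selfie_sign_date=date(2024, 6, 3),
    liveness_passed=True,
)
print(review_submission(sub, today=date(2024, 6, 3)))  # [] -> passed
```

As the next paragraph explains, every input to a flow like this (the ID photo, the selfie, even the live video) can now be synthesized by generative AI, which is why such checks on their own are no longer sufficient.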
However, AI-generated IDs can bypass these verification methods. For example, services like OnlyFake use AI to create fake IDs that allegedly pass rigorous KYC checks on major cryptocurrency exchanges like Binance and Coinbase. These fake IDs are generated using neural networks and can be purchased for as little as $15. The Deepfake Offensive Toolkit, or dot, creates deepfakes for virtual camera injection, allowing users to swap their face with an AI-generated one to pass identity verification. According to reporting in The Verge, financial institutions’ KYC identity checks, which typically require a user to look into their phone or laptop camera, are easily fooled by AI-generated deepfakes.
Combining AI with blockchain can mitigate fraud enabled by generative AI
Blockchain and AI are complementary technologies that can be effective for fraud detection and investigation, both independently and in combination.
Blockchain for verification
Decentralization, immutability, and rule-based consensus are key features of blockchain technology that make it useful for identity verification and fraud detection. For example, transactions written to the blockchain are immutable (i.e., the data cannot be deleted or altered), which prevents potential fraudsters from rewriting transaction records. Additionally, transactions written to public blockchains, such as the Bitcoin blockchain, are fully searchable and transparent, making it difficult for fraudulent activity to go undetected. Blockchains are also distributed by nature, making it more difficult for a single entity or a small group of entities to make unauthorized changes to blockchain data. Finally, data on blockchains can be cryptographically hashed, generating a unique digital fingerprint that is nearly impossible to recreate. This helps in tracking fraudulent transactions: if someone tampers with the underlying data, the hash value changes as well.
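As a rough illustration of the hashing property described above, the sketch below chains records together with SHA-256 so that altering any earlier record invalidates every subsequent fingerprint. This is a toy model written for this article, not an implementation of any production blockchain; the record format and field names are assumptions.

```python
import hashlib
import json

def hash_block(block: dict) -> str:
    """Compute a SHA-256 fingerprint over a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: dict) -> None:
    """Append a block that commits to the hash of the previous block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hash_block({"data": data, "prev_hash": prev_hash})
    chain.append(block)

def verify_chain(chain: list) -> bool:
    """Recompute every fingerprint; tampering with earlier data breaks the links."""
    prev_hash = "0" * 64
    for block in chain:
        expected = hash_block({"data": block["data"], "prev_hash": prev_hash})
        if block["hash"] != expected or block["prev_hash"] != prev_hash:
            return False
        prev_hash = block["hash"]
    return True

chain = []
append_block(chain, {"from": "alice", "to": "bob", "amount": 5})
append_block(chain, {"from": "bob", "to": "carol", "amount": 2})
print(verify_chain(chain))        # True
chain[0]["data"]["amount"] = 500  # tamper with an earlier transaction
print(verify_chain(chain))        # False: the fingerprints no longer match
```

Real blockchains add consensus rules and distribution on top of this linking, but the core tamper-evidence comes from the same hashing idea.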
AI for detection
AI can improve fraud detection by analyzing user behavior patterns and identifying anomalies in real time. Unlike blockchain technology, which is well suited to auditing past transactions, AI can learn and adapt to potentially fraudulent behavior as it happens. Advanced detection algorithms can flag suspicious activity that deviates from normal usage, quickly sifting through mountains of data to surface subtle inconsistencies that often escape human detection. Machine learning models and AI-driven behavioral analysis can also examine user interactions such as mouse movement patterns and typing style, adding a further layer of identity verification on top of the blockchain. AI’s ability to proactively monitor for and detect fraud, combined with blockchain’s ability to authenticate user identity and transaction validity, is a powerful pairing.
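To illustrate the kind of behavioral anomaly detection described above, the sketch below flags outlying user sessions with scikit-learn’s IsolationForest. The features (session length, typing speed, mouse movement) and the data are invented for this example; a production system would rely on far richer signals and continuously retrained models.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row is one session: [session_minutes, keystrokes_per_min, mouse_px_moved]
rng = np.random.default_rng(0)
normal_sessions = np.column_stack([
    rng.normal(12, 3, 500),          # typical session length
    rng.normal(180, 25, 500),        # typical typing speed
    rng.normal(40_000, 8_000, 500),  # typical mouse movement
])

# Train the detector on historical "normal" behavior.
detector = IsolationForest(contamination=0.01, random_state=42)
detector.fit(normal_sessions)

# Score new sessions as they arrive; -1 marks an anomaly to flag for review.
new_sessions = np.array([
    [11.0, 175.0, 41_000.0],  # looks like ordinary human behavior
    [0.5, 900.0, 200.0],      # bot-like: very fast typing, almost no mouse use
])
print(detector.predict(new_sessions))  # e.g., [ 1 -1 ]
```

In practice, flags like these feed human review queues or trigger step-up verification rather than blocking users outright.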
There is an urgent need to develop solutions as AI advances
Cryptocurrency-related cybercrime continues to grow as AI-powered deepfakes become more credible and realistic. However, in the face of this growing threat, several startups have developed AI-centric blockchain tools to combat fraud and other illicit activities in the digital asset sector.
For example, BlockTrace and AnChain.AI are two companies leveraging the synergies between blockchain technology and AI to combat cryptocurrency crime. BlockTrace, whose mission is to help governments and private companies combat cryptocurrency-related financial crime, recently partnered with AnChain.AI, a company that uses AI to combat fraud, scams, and other financial crimes involving digital assets. Together, they will provide solutions that enable national security agencies to use AI to investigate smart contracts, conduct blockchain transaction investigations, and deliver cybersecurity insights to national security officials.
The industry is on the verge of fully harnessing the potential of AI and blockchain to combat AI-enabled fraud, and there are many more developments to come given the dizzying speed at which AI is advancing.