CZ Warns of AI Deepfake Scams as Lazarus Targets Crypto Execs via Zoom

Crypto leaders are sounding the alarm over a new kind of scam that’s catching even seasoned professionals off guard: AI-powered deepfake impersonations over video calls. Former Binance CEO Changpeng Zhao (CZ) is the latest to speak out, warning the crypto community to stay vigilant as reports of these attacks grow.
AI already used in new types deepfake hacking. Even a video call verification will soon be out of the window. 😨😱
Don't install software from a non-official link, especially NOT from your "friends" (they are most likely hacked). https://t.co/kfRSDPiJWb
— CZ 🔶 BNB (@cz_binance) June 20, 2025
The threat isn’t theoretical. Industry veterans from Japan to the U.S. have already fallen victim, with attackers using realistic deepfakes to gain trust over platforms like Zoom—only to deploy malware mid-call and compromise wallets, messaging accounts, and other sensitive tools.
One of the most concerning recent incidents involved Japanese crypto figure Mai Fujimoto, known as “Miss Bitcoin.” During what appeared to be a routine Zoom call, a deepfake of a trusted contact claimed to be having audio trouble and sent her a link to “fix” it—a link that installed malware and compromised her accounts.
“If I had known this kind of attack existed, I would’ve been more careful,” she said. “I want others to be aware and avoid the same mistake.”
A Pattern of Sophisticated Social Engineering
Fujimoto isn’t alone. Former Animoca Brands executive Mehdi Farooq reported a similar attack. In his case, deepfakes of not one but two acquaintances were used to stage a fake video meeting. Just as with Fujimoto, the attackers complained about audio issues and sent him a “fix.” Once he installed the malware, his crypto wallets were drained, wiping out most of his savings.
Executives from Manta Network, Mon Protocol, Stably, and Devdock AI have also flagged similar phishing attempts. The common thread? Realistic video calls, believable impersonations, and malware disguised as helpful downloads.
Cybersecurity analysts are tying the attacks to Lazarus Group, the North Korean state-linked hacker organization responsible for a long list of high-profile crypto heists. This new wave of attacks appears to be another evolution of their tactics—using AI to blur the lines between real and fake in ways that make even video verification unreliable.
Why Deepfakes Are Hard to Spot—and Dangerous
Deepfake technology has advanced rapidly, and fraudsters are now using it to impersonate everyone from crypto execs to public officials. According to a Bitget report, deepfakes accounted for 40% of all high-value frauds in 2024, making it one of the most dangerous tools in the scammer's arsenal.
CZ echoed these concerns, warning that traditional methods of verifying someone’s identity—like seeing their face on a call—may no longer be enough. “Stay alert,” he advised, especially when asked to click links or install software during calls, no matter how convincing they seem.