You receive an email from your CEO asking you to join an urgent video call. You join, and there they are — your CEO, your CFO, familiar faces, familiar voices. They ask you to process an urgent wire transfer. Everything looks and sounds legitimate. Except none of them are real.
This isn't science fiction. This exact scenario played out at a Hong Kong firm in early 2024, resulting in a $25.6 million loss. And in 2026, these attacks are becoming alarmingly common.
What Are Deepfake Email Scams?
Deepfake email scams are a new evolution of business email compromise (BEC). They start like any phishing attack — with an email. But instead of relying solely on AI-generated text, attackers escalate to cloned voices and synthetic video to impersonate trusted people.
The attack chain typically looks like this:
- A phishing email arrives, impersonating an executive or trusted contact
- A follow-up call uses an AI-cloned voice to build credibility
- A live video call with deepfaked faces removes any remaining doubt
- The victim is pressured into an urgent wire transfer or data handoff
Why Deepfake Scams Are So Effective
Traditional phishing relies on text. You can learn to spot suspicious wording, check sender addresses, and verify links. But deepfakes exploit something much harder to override: trust in what you see and hear.
- Voice cloning needs only seconds of audio — A short clip from a conference talk, podcast, or even a voicemail greeting is enough for AI to clone someone's voice
- Real-time video deepfakes are now possible — Attackers can generate a live video feed of anyone during a call
- Multi-channel attacks bypass skepticism — An email alone might trigger suspicion, but an email followed by a "personal" video call feels legitimate
- Urgency is weaponized — Attackers use time pressure to prevent victims from verifying through other channels
Real-World Deepfake Scam Cases
The $25.6 Million Video Call
A finance department employee at a Hong Kong firm received an email from the company's UK-based CFO requesting an urgent transfer. Suspicious at first, the employee was invited to a video call where they saw and spoke with the CFO and other colleagues — all deepfakes. Convinced by what they saw, they processed 15 transactions totaling $25.6 million before the fraud was discovered.
The Accounting Firm Campaign
In 2025, a phishing campaign targeted 800 accounting firms with AI-generated emails referencing specific state registration details. The campaign achieved a 27% click rate — far above the industry average. Several firms reported follow-up deepfake voice calls from supposed "clients" requesting sensitive financial documents.
The Family Emergency Scam
It's not just businesses. Criminals are using voice cloning to call families, impersonating loved ones in distress. "Mom, I've been in an accident, I need money right now." The voice sounds exactly like their child. The emotional pressure is overwhelming.
How to Detect and Prevent Deepfake Scams
1. Establish a Safe Word
The FTC and major cybersecurity firms now recommend a "Family Safe Word" — a unique, nonsensical phrase that is never shared online. Use it to verify identity on calls. If someone claiming to be your CEO or family member can't provide the safe word, hang up and verify independently.
2. Never Act on Urgent Financial Requests via Email Alone
Make it company policy: no wire transfers, no credential sharing, and no sensitive data sent based solely on email or video call requests. Always verify through a separate, pre-established channel.
3. Call Back on a Known Number
If you receive an unusual request — even from a video call that looks legitimate — hang up and call the person back on a phone number you already have saved. Never use a number provided in the suspicious email.
4. Check the Email First
Every deepfake scam starts with an email. Before the attack escalates to voice or video, the initial email can reveal the fraud. Check:
- Sender's actual email address (not just the display name)
- SPF, DKIM, and DMARC authentication results
- Any links in the email — hover before clicking
- Whether the domain matches the real company
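The checks above can be automated. Here is a minimal sketch in Python using only the standard library's `email` module; the domain, addresses, and raw message are illustrative assumptions, and a real mail pipeline would read the `Authentication-Results` header added by your own receiving server:

```python
from email import message_from_string
from email.utils import parseaddr

# Hypothetical raw message illustrating a spoofed display name
# (note the "1" in "examp1e-corp.com").
RAW = """\
From: "CEO Jane Doe" <jane.doe@examp1e-corp.com>
To: finance@example-corp.com
Subject: Urgent wire transfer
Authentication-Results: mx.example-corp.com;
 spf=fail smtp.mailfrom=examp1e-corp.com;
 dkim=none;
 dmarc=fail header.from=examp1e-corp.com

Please process the attached transfer immediately.
"""

TRUSTED_DOMAIN = "example-corp.com"  # the real company domain (assumed)

def check_sender(raw: str, trusted_domain: str) -> list[str]:
    """Return a list of red flags found in the message headers."""
    msg = message_from_string(raw)
    flags = []

    # 1. Compare the actual address against the trusted domain,
    #    not the display name the mail client shows.
    display_name, addr = parseaddr(msg.get("From", ""))
    domain = addr.rsplit("@", 1)[-1].lower()
    if domain != trusted_domain:
        flags.append(f"sender domain {domain!r} is not {trusted_domain!r}")

    # 2. Inspect the SPF/DKIM/DMARC results recorded by the
    #    receiving server; anything short of "pass" is suspect.
    auth = (msg.get("Authentication-Results") or "").lower()
    for mech in ("spf", "dkim", "dmarc"):
        if f"{mech}=pass" not in auth:
            flags.append(f"{mech} did not pass")

    return flags

for flag in check_sender(RAW, TRUSTED_DOMAIN):
    print("RED FLAG:", flag)
```

For this sample message, the function flags the look-alike domain plus all three failed authentication mechanisms. The same parsing works on any raw `.eml` file exported from a mail client.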
5. Watch for Deepfake Artifacts
Current deepfake technology still has limitations. During video calls, look for:
- Unnatural blinking patterns or lack of blinking
- Slight audio-video sync issues — lips not perfectly matching words
- Unusual lighting or skin texture, especially around the edges of the face
- The person avoiding turning their head to extreme angles
- Background inconsistencies or warping
6. Enable Multi-Factor Verification for Transactions
Require multiple people to approve large transactions. A single person should never have the authority to process major wire transfers based on one communication channel.
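The rule can be expressed as a simple invariant in code. This is a hedged sketch, not a real payment API: the threshold, employee IDs, and class names are all illustrative assumptions.

```python
from dataclasses import dataclass, field

APPROVAL_THRESHOLD = 10_000  # amounts above this need two approvers (assumed policy)
REQUIRED_APPROVERS = 2

@dataclass
class WireTransfer:
    amount: float
    beneficiary: str
    approvals: set = field(default_factory=set)

    def approve(self, employee_id: str) -> None:
        # A set keeps approvers distinct: the same person
        # approving twice still counts only once.
        self.approvals.add(employee_id)

    def can_execute(self) -> bool:
        if self.amount <= APPROVAL_THRESHOLD:
            return len(self.approvals) >= 1
        # Large transfers require distinct approvers, so one
        # compromised person (or one convincing deepfake call)
        # is never enough on its own.
        return len(self.approvals) >= REQUIRED_APPROVERS

transfer = WireTransfer(amount=250_000, beneficiary="ACME Ltd")
transfer.approve("emp-001")
print(transfer.can_execute())  # still blocked: one approval is not enough
transfer.approve("emp-002")
print(transfer.can_execute())  # executable once a second, distinct person approves
```

The key design choice is that the check lives in the payment system itself, not in anyone's judgment, so urgency and social pressure cannot override it.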
What to Do If You Suspect a Deepfake Scam
- Stop the transaction immediately — Contact your bank if money has already been sent
- Report it — File a report with the FTC (US), Action Fraud (UK), or your local cybercrime authority
- Preserve the evidence — Save the original email, call recordings, and any other communication
- Alert your organization — If it happened at work, notify IT security and management immediately
- Forward the email to us — We can analyze the technical details of the initial phishing email
Let Us Check the Email
Deepfakes are convincing, but the email that starts the attack can't fake technical authentication. Forward any suspicious email to us, and we'll analyze sender verification, email headers, links, and domain reputation in seconds.
Catching the phishing email before it escalates to a deepfake call is your strongest defense.