If a Relative Calls Asking for Money, Pause First: How to Avoid AI Voice and Deepfake Scams
2026-05-03

Scam messages used to be easier to spot. Many had awkward grammar, strange formatting, or obvious mistakes. That is changing. AI can generate more natural writing, realistic voice clips, and even fake video. Scammers may no longer sound like strangers. They may sound like your child, parent, boss, bank representative, investment adviser, or platform support agent.
The most dangerous situation is a familiar-sounding voice combined with urgency: a car accident, an arrest, a frozen account, an emergency payment, or a business transfer that must happen immediately.
In April 2026, the FBI released its IC3 2025 Internet Crime Report, which recorded 1,008,597 complaints and nearly $21 billion in losses to Americans from cyber-enabled crime. For the first time, the report also broke out AI-related crime as its own category: 22,364 complaints and nearly $893 million in losses. The FBI noted that criminals can use fake social profiles, voice cloning, forged identity documents, and convincing video to impersonate public figures or people you know. Source: FBI: Cryptocurrency and AI Scams Bilk Americans of Billions
The FTC also warns that voice cloning can make fraudulent requests more believable. If a call sounds like a boss asking for banking details or a family member asking for emergency money, people are more likely to act under pressure. The FTC has also supported work on detecting synthetic voice, real-time deepfake audio detection, and audio watermarking through its Voice Cloning Challenge. Source: FTC Consumer Advice: Fighting back against harmful voice cloning
Most households do not need advanced AI detection tools. They need a verification process. First, do not judge identity by voice alone; a voice that sounds right is not proof. Second, do not transfer money while still on the call. The more urgent the request feels, the more important it is to pause. Third, hang up and call back using the number already saved in your contacts, not the number that called you. Fourth, agree on a family passphrase in advance. If someone asks for emergency money, ask for the passphrase, or for a detail only the family would know, before acting.
Video calls are not perfect proof either. Deepfake video may show unnatural mouth movement, odd blinking, mismatched lighting, or audio that does not sync perfectly with facial expressions, but these signs are becoming harder to detect. Process is more reliable than visual inspection: contact the person through another channel, delay money movement, and treat requests involving gift cards, cryptocurrency, third-party accounts, or remote control software as high risk.
Practical Checklist
First, if a relative urgently asks for money, hang up and call back on the number you already have saved for them.
Second, create a family passphrase.
Third, never share verification codes with callers claiming to be from your bank, a support team, or a government agency.
Fourth, do not share your screen with anyone who contacts you about a financial issue.
Fifth, do not install remote control software requested by a caller.
Sixth, wait at least 10 minutes before any large transfer and ask a second family member to confirm.
Seventh, if you suspect fraud, contact your bank immediately and preserve evidence.
This article is for general scam prevention information only and is not legal, financial, or cybersecurity advice.