It starts with a simple question: “Can you hear me?”
You say “yes” without thinking. Then the caller hangs up. Strange, right? But that one word, recorded with or without your consent, may now be part of a scam you have never even heard of.
In 2025, cybercriminals are no longer after just your password or OTP. They are after your voice. From AI voice cloning to fraudulently authorizing services in your name, your “yes” can unlock real damage.
So before you answer the next unknown call, here is what you need to understand.
Why Saying “Yes” on a Call Is No Longer Harmless
Most people think phone scams are obvious: a robotic voice asking for your bank info or a caller pretending to be your network provider.
So when a stranger simply asks, “Can you hear me?”, it feels harmless to respond with a quick “yes”. But that is exactly the trap.

In today’s digital world, your voice is not just a sound; it is a biometric signature. Banks, telecom companies, and even customer service bots now use voice authentication. And scammers have adapted to take advantage of exactly that.
That one word, “yes”, captured during a scam call, can be recorded, isolated, and used to:
Authorize charges in your name
Fool AI voice recognition systems
Train deepfake models that sound exactly like you
The worst part? You may not even know it happened until it’s too late.
How the “Say Yes” Scam Works, Step by Step
Scammers are not just trying to talk to you. They are trying to record you, isolate your voice, and use it against you. Here is how the scam typically plays out:
Step 1: The Setup: A Seemingly Innocent Call
You get a call from a number that looks local or familiar. The voice on the other end might say:
“Can you hear me?”
“Is this [your name]?”
“Are you the homeowner?”
“Is the car in good condition?”
These questions are crafted to trigger a reflexive response, especially “yes”.

Step 2: Voice Recording Begins
The call is being recorded from the moment you answer; in some cases, the scammer may even stage a transfer to a “recorded line” to make the call sound legitimate. They are simply after a clean recording of you saying “yes”, ideally with no background noise, hesitation, or overlapping speech.
Step 3: Audio Is Isolated and Edited
Using basic audio editing tools, they extract the clean “yes” and splice it into pre-recorded audio to fabricate proof of consent.
This might include:
Authorizing new services
Verifying identity during account recovery
Faking a voice confirmation for customer service systems
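To see why the editing step above requires no specialist skills at all, here is a minimal sketch using only Python’s standard library. The file names, timestamps, and synthesized tone are hypothetical stand-ins: the script generates a short WAV file in place of a real call recording, then copies out the half-second span where a spoken “yes” would sit.

```python
import math
import struct
import wave

RATE = 8000  # telephone-quality sample rate


def make_recording(path, seconds=3):
    """Synthesize a mono 440 Hz tone as a stand-in 'call recording'."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 16-bit samples
        w.setframerate(RATE)
        for i in range(RATE * seconds):
            sample = int(10000 * math.sin(2 * math.pi * 440 * i / RATE))
            w.writeframes(struct.pack("<h", sample))


def extract_clip(src, dst, start_s, end_s):
    """Copy the frames between start_s and end_s into a new WAV file."""
    with wave.open(src, "rb") as r:
        r.setpos(round(start_s * r.getframerate()))
        frames = r.readframes(round((end_s - start_s) * r.getframerate()))
        params = r.getparams()
    with wave.open(dst, "wb") as w:
        w.setparams(params)  # frame count in the header is fixed on close
        w.writeframes(frames)


make_recording("call.wav")
extract_clip("call.wav", "yes.wav", 1.2, 1.7)  # the half-second "yes"
```

A real attacker would of course work from an actual recording and a point-and-click editor rather than code, which is precisely the point: isolating one clean word is a trivial operation.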
Step 4: The “Yes” Is Used or Sold
In advanced scams, your “yes” becomes part of a larger profile and may be:
Used to train AI deepfakes that sound like you
Paired with leaked data from previous breaches to reset accounts
Sold on dark web markets alongside your phone number or email
Step 5: The Damage Is Done
You might not notice anything wrong immediately. But over time, you may start seeing:
Charges you did not approve
Messages from services you didn’t sign up for
People telling you, “I swear I just spoke to you…”
That’s how subtle and dangerous this scam really is.
5 Alarming Ways Scammers Use Your Voice Against You
1. Voice Biometric Hijack (VBH)
Scammers are exploiting voice biometric systems, where your voice serves as your password, in sectors like mobile banking, telecom account recovery, insurance, and corporate support centers.
Fraudsters record short clips such as “yes”, “that’s right”, or “my name is” and, using tools like ElevenLabs or Descript, generate synthetic replicas of your voice. With these, they can bypass authentication, request SIM resets, gain account access, or authorize unauthorized transfers.
In a May 2025 experiment, a tech journalist used a cheap voice synthesizer to mimic her own voice and navigated a bank’s phone authentication system for five minutes, demonstrating just how easily these systems can be fooled.
2. Synthetic Voice Deepfakes
With just a short recording of your voice, bad actors can create deepfake audio and use it to scam your loved ones or colleagues at work.
In a now-infamous case from January 2020, a branch manager in the UAE received what seemed to be a call from his company’s CEO, backed up by convincing emails. He authorized transfers totaling US $35 million to accounts around the world; investigators later confirmed that a deepfake voice had deceived him.
Similarly, in 2019, fraudsters cloned the voice of a UK energy company’s CEO and convinced an executive to transfer around €220,000 to a bank in Hungary.
A separate incident in Hong Kong in 2024 involved an employee transferring $25 million after a video call with deepfake versions of company executives.
3. Psychological Priming for Consent-Based Fraud
Scammers often use a recorded “yes” as fake consent. For example, a victim’s “yes” could be spliced into a manipulated voicemail or approval call and used to falsely claim the victim signed up for something, from loans and utilities to medical services.
4. Callback Spoofing & Voicemail Attacks
Attackers using caller ID spoofing or VoIP systems may trick victims into providing voice confirmations. These recordings can be used to reset voicemail passwords or bypass voice-based two-factor authentication, enabling, for instance, unauthorized access to services like Apple ID.
5. Dark Web “Voice Token” Trading
Criminals bundle your voice recordings, often including your name and metadata, into “voice tokens” that are sold for around $10 to $50. Fraud farms then use these to train AI impersonation bots.
Veriff notes that one in twenty biometric verification attempts in financial services is now fraudulent, a 21% increase year-over-year.
Smart Alternatives to Saying “Yes” on Scam or Unknown Calls
When they ask: “Can you hear me?”
“I can hear you.”
“You’re coming through.”
“The line is clear.”
When they ask: “Is this [Your Name]?”
“Who’s asking?”
“May I ask who’s calling and why?”
“This number is monitored. Please identify yourself.”
When they ask: “Is this the owner of the home/phone/business?”
“Why do you need to know?”
“What’s this about?”
“I don’t confirm details over the phone. Who are you with?”
If you’re not sure who’s calling:
“I don’t take unsolicited calls. Please send this request in writing.”
“I’ll call back using your official company number.”
If pressured to agree:
“I don’t consent to any recording or transaction.”
“I’m not interested. Please remove my number.”
“I’ll consult my lawyer/IT department before continuing.”
Always remember, the convenience of voice technology comes with a hidden cost: once your voice is recorded, it can be cloned, traded, and weaponized without your knowledge. From bypassing bank security to tricking your closest contacts, voice-based scams are growing faster than most people realize.
While individuals must be cautious, avoiding casual confirmations like “yes” on unknown calls and choosing security methods that do not rely solely on biometrics, institutions also have a responsibility.
Banks, telecom providers, and service centers using voice authentication should integrate layered verification methods, such as multi-factor authentication and fraud-detection AI, to make deepfake attacks harder to execute.
Your voice is as unique as your fingerprint. Whether you are a customer or a company, protecting it should be a priority, because in the wrong hands, it can become a powerful tool for fraud.