The Silent Scam: Why Your ‘Hello’ Could Be Used Against You in AI Voice Fraud
In an era of rapidly advancing generative AI, a simple phone greeting is becoming a potential security vulnerability. A growing trend in phone scams involves callers who remain silent after the victim answers, specifically to harvest voice samples for AI voice cloning.

This evolution in fraud highlights the increasing intersection of traditional social engineering and artificial intelligence, necessitating a shift in how users interact with unknown callers to protect their digital and financial identities.
The Tactic of the Silent Call
When users encounter a silent line after answering a call, the natural human instinct is to repeat “Hello?” or ask “Is anyone there?” multiple times. However, this behavior provides scammers with exactly what they need: a clean, isolated audio recording of the victim’s voice.
Rather than engaging in a traditional conversation, the attacker simply records the greeting. Modern voice-synthesis systems can produce a convincing clone from only a few seconds of clear speech, so even these brief snippets can give bad actors enough material to work with.
AI Voice Cloning and the ‘Yes’ Trap
These recorded clips are then fed into AI voice-synthesis tools. By mimicking the target’s tone, pitch, and cadence, attackers can create highly convincing audio deepfakes. This allows them to impersonate the victim in sophisticated social engineering attacks, often calling family members or financial institutions to request urgent funds or sensitive information.
Beyond simple greetings, scammers specifically target the word “Yes.” A recorded affirmation can be spliced into a different conversation to falsely “authorize” a subscription, a service, or a payment, effectively using the victim’s own voice as a fraudulent signature.
How to Protect Yourself
To mitigate the risk of voice harvesting, security experts recommend a cautious approach to unknown numbers. If you answer a call and the other party remains silent, the safest course of action is to terminate the call immediately.
Users are advised not to repeat their greeting or ask who is on the line when there is no immediate response. By hanging up quickly, you deny the attacker the audio samples needed to build a convincing AI clone of your voice.