
Voice Cloning Scam: Protect Yourself from AI Fraud

by Sophie Williams

Criminals are using AI-cloned voices to defraud victims' friends and family, requesting bank transfers or money as if the message came from the voice's owner. Contact information is typically obtained from social media and from data breaches, accessed through illegal data panels and compromised systems, according to reporting by UOL.

Experts say the increasing accessibility of artificial intelligence has made this type of fraud more widespread, extending beyond specialized cybercrime circles. Hiago Kin Levi, president of the Brazilian Institute of Cyber Incident Response, notes that criminals with limited technical skills are now capable of simulating voices.

How to Protect Yourself from Voice Cloning Scams

Silencing calls from unknown numbers can reduce the risk of falling victim to calls designed to capture your voice. Users can configure their Android or iPhone devices to receive calls only from known contacts.

If you answer a call and hear only silence, avoid saying “hello” or “yes” and wait for the other party to speak. The São Paulo Civil Police recommends staying quiet and not sharing any sensitive information during suspicious calls.

Even when the caller does speak, exercise caution and be wary of urgent requests for money. Hiago Kin Levi explains that AI technologies are becoming more sophisticated and can generate remarkably human-like voices, sometimes even addressing the target by first name. This underscores the need for heightened skepticism toward unexpected requests, even from seemingly familiar voices.

