Voice cloning – an AI application that generates a synthetic copy of a person’s voice – can be a powerful assistant for those who’ve lost their voice due to accident or illness. It facilitates translations, speech therapy, and assistive reading. It helps professionals build compelling presentations. Studios use it to produce videos, audiobooks, podcasts, and video games. Financial institutions use it as a biometric authentication factor (voice print recognition). And so on. But, like many technological breakthroughs, voice cloning can also be used for crime.
Voice cloning has been around for years, but it used to require downloading specialized software to synthesize or clone a voice. Now a wide range of voice cloning services is readily available on the web. Using them requires no technical knowledge, and they have become so advanced that they can replicate a voice with uncanny accuracy.
Combined with generative AI like ChatGPT, voice cloning has become a powerful weapon in the hands of malicious actors. Scammers exploit it to impersonate individuals convincingly, tricking victims into divulging sensitive information or transferring money.
With the public facing an uphill battle against scammers, here are a few examples of what to watch out for and, more importantly, how to protect yourself and your loved ones.
Imagine receiving a frantic call from your "child" claiming they’ve been in a car accident and need money immediately to avoid legal trouble. The voice on the other end sounds exactly like your son or daughter. This is the hallmark of the "emergency family member" scam.
The Trapp family in the San Francisco Bay Area suffered this trickery firsthand when they got a frantic call from their “son” saying he’d been in a car accident, injured a pregnant woman, and needed urgent help. The scammers posed not only as the son but also as police, instructing the distressed mother to quickly withdraw $15,000 in cash and hand it over to a courier already on his way to the family’s house.
Luckily, the parents ultimately grew suspicious and contacted police in the jurisdiction of the alleged accident. They then reached their son on his mobile phone and uncovered the scam.
How they do it:
Scammers obtain the family’s phone numbers from breach dumps, doxxing, or phishing. They capture a sample of the child’s voice from social media or a phone call, use that sample to create a convincing clone of the child’s voice, then call the parents pretending to be the child in distress and request immediate financial help.
Also known as "Business Email Compromise 2.0," this scam targets businesses by mimicking the voice of a high-ranking executive, like a CEO, to authorize fraudulent wire transfers.
In 2019, the chief executive of a UK firm believed he was on the phone with his boss at the German parent company, who’d ordered him to immediately move £220,000 (US $243,000) into what he thought was the bank account of a supplier. The scammer impersonating the big boss told the UK exec the payment was urgent and should be made within the hour. The UK-based CEO said he’d recognized the “slight German accent” of his boss and the “melody” of his voice on the phone.
How they do it:
Scammers use recordings of a CEO from interviews, speeches, or earnings calls to create a voice clone. They then call a finance officer or someone in charge of wire transfers within the firm and instruct them to send money to a specific account, claiming it’s for an urgent business deal.
Voice cloning is breathing new life into the age-old “help desk” scam. A fake customer service agent may call you using a cloned voice to extract personal information or payment details.
How they do it:
Picture this: you receive a call claiming to be from your bank. The "robotic" voice matches the one you’ve heard many times in automated messages, instructing you to confirm your account details to resolve a "security issue." This familiarity makes the scam more convincing, as the cloned voice aligns with what you’ve heard during legitimate interactions with the institution. In reality, the call is designed to steal your access credentials.
The scammer might use this technique to set up the call and then follow with a more "personal" cloned voice claiming to be a representative. The progression from an automated-sounding voice to a seemingly real human voice adds yet another layer of believability.
If you’re alarmed by the scenarios above, you should be. A single slip-up can compromise your digital privacy and security, cost you your job, or even wipe out your life’s savings. But there are plenty of ways to protect yourself:
· Limit your digital footprint: Be mindful of sharing personal videos and audio online, as they can be used for voice cloning
· Stay vigilant: Always question unexpected calls that instill a sense of urgency
· Strengthen verification protocols: Use multi-factor authentication (MFA/2FA) for sensitive accounts and establish verbal security codes (passwords) with trusted friends and family members
· Stay informed: Regularly educate yourself on emerging threats, and teach your family and colleagues to do the same. Follow cybersecurity news to learn which techniques scammers are exploiting as technology evolves
· Use a scam detection tool: If you're suspicious of a certain phone call, email or text, consider using Scamio, our clever scam-fighting chatbot designed specifically to combat socially engineered fraud attacks. Simply describe the situation to Scamio and let it guide you to safety
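For technically inclined readers, the "verbal security code" advice above is just a shared-secret check, the same idea software uses to verify passwords. Here is a minimal sketch in Python (the passphrase and helper name are hypothetical examples, not anything from a real product):

```python
import hmac

# Hypothetical passphrase your family agreed on in advance.
FAMILY_CODE = "blue-giraffe-1987"

def verify_caller(spoken_code: str) -> bool:
    """Return True only if the caller can produce the agreed passphrase.

    hmac.compare_digest compares the strings in constant time, a good
    habit for any secret comparison. Whitespace and letter case are
    normalized, since the code is spoken aloud rather than typed.
    """
    return hmac.compare_digest(spoken_code.strip().lower(),
                               FAMILY_CODE.lower())

# A caller who knows the code passes; a scammer with only a cloned
# voice, and no knowledge of the code, does not.
print(verify_caller("Blue-Giraffe-1987"))   # genuine caller
print(verify_caller("Just send the money")) # impostor
```

The point is that the passphrase tests *knowledge*, which a voice clone cannot replicate, so the check holds even when the voice itself sounds perfect.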
Voice cloning technology is a double-edged sword, offering incredible potential for good while presenting new risks in the hands of bad actors. It’s important to know how scams are crafted and executed so you can take proactive steps to safeguard yourself. As technology evolves, so must our awareness and preparedness.
Filip has 15 years of experience in technology journalism. In recent years, he has turned his focus to cybersecurity in his role as Information Security Analyst at Bitdefender.