[Image: voice cloning. Photo credit: Vika Strawberrika / Unsplash]

How to Protect Yourself from Voice Cloning Scams

AI voice scams are on the rise: 1 in 4 adults report having experienced one, and 77% of victims say they lost money as a result. A recent scandal involving a fake Joe Biden robocall telling Democrats not to vote drew attention to the issue, with Google searches for ‘voice cloning scam’ soaring by 170% in the past month.

Alexander Konovalov, co-founder of Vidby, a company specializing in AI-powered voice translation and “Technologies of Understanding,” offers six practical tips for spotting and protecting against fraud that exploits voice cloning technology.

6 tips to avoid falling victim to voice cloning scams


  1. Verify their identity: Don’t take a familiar voice as proof that a caller is who they say they are, especially when discussing sensitive subjects or financial transactions. Ask them to provide as many details as possible: the name of their organization, the city they’re calling from, and any information that only you and the real caller would know.
  2. Test their reactions: Say something that a real person wouldn’t expect to hear. For instance, if scammers are using artificial intelligence to imitate an emergency call from your relative, say something out of place, such as “Honey, I love you”. Whereas a real person would react with panic or confusion, AI would simply reply “I love you too”.
  3. Laugh: AI has a hard time recognizing laughter, so crack a joke and gauge the person’s reaction. If their laugh sounds authentic, chances are there’s at least a human on the other end of the line.
  4. Listen for anomalies: While voice cloning technology can be convincing, it isn’t yet perfect. Listen out for unusual background noises and unexpected changes in tone, which may result from the variety of data used to train the AI model. Unusual pauses and speech that sounds like it was generated by ChatGPT are also a clear giveaway that you’re talking to a machine.
  5. Treat urgency with skepticism: Scammers often use urgency to their advantage, pressuring victims into acting before they have time to spot the red flags. If you’re urged to download a file, send money, or hand over information without carrying out due diligence, proceed with caution. Take your time to verify any claims (even if they insist there’s no time).
  6. Don’t overshare: Avoid sharing unnecessary personal information online or over the phone. Scammers often phish for private information they can use to impersonate you by pretending to be from a bank or government agency. If the person on the other end seems to be prying, hang up, find a number on the organization’s official website, and call back to confirm their legitimacy.

As AI voice scams continue to evolve, staying vigilant and informed is our best defense. The surge in sophisticated scams, highlighted by incidents like the fake political robocalls, underscores the urgency of awareness and caution. By verifying identities, testing reactions, listening for anomalies, and treating urgent requests with skepticism, we can avoid falling prey to these deceptions. In a world where technology blurs the line between reality and imitation, your skepticism is a powerful tool: apply these practical tips to safeguard your personal information and financial resources.
