Privacy & Security

Voice Recognition to Prevent Fraud? Think Again!

Your voice is the newest security feature used to uniquely identify you. Smartphones and devices like the Amazon Echo let you add items to your personal calendar, make calls, or send emails using only your voice.

Amazon and Google also offer voice-activated shopping. With Amazon Alexa, you can order millions of products on Amazon.com. A Google Home smart speaker offers similar capabilities, and you can buy from any retailer tied into the system.

Banks have also climbed onto the voiceprint bandwagon. Citibank and HSBC, among other financial institutions, offer account authentication through voice recognition. Once you opt in, you can authenticate your account with your voice when you contact customer service by phone.

What could possibly go wrong? A lot, it turns out.

Last year, researchers demonstrated that hackers could pass instructions to voice-activated assistants like Alexa or Siri using ultrasonic frequencies humans can’t hear. Researchers in China developed a technique they call DolphinAttack to communicate with the voice recognition systems these assistants use. All a hacker needs to do is say something like, “Alexa, please send Mark Nestmann a DeLonghi America ECAM28465M Prima Donna Fully Automatic Espresso Machine with Lattecrema System.” DolphinAttack converts the command into an ultrasonic frequency and sends it to your voice-activated assistant.
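In broad strokes, the published DolphinAttack research works by amplitude-modulating an audible voice command onto an ultrasonic carrier; the nonlinearity of a device’s microphone then demodulates a low-frequency copy that the speech recognizer can “hear” even though humans can’t. Here is a minimal sketch of that modulate-then-demodulate idea, with illustrative parameters only (a pure tone standing in for speech, and a simple squaring term standing in for microphone nonlinearity) – not the researchers’ actual setup:

```python
import numpy as np

# Illustrative parameters (assumed for this sketch, not from the paper)
fs = 192_000          # sample rate high enough to represent ultrasound
carrier_hz = 30_000   # ultrasonic carrier, above human hearing (~20 kHz)
duration = 0.01       # seconds
t = np.arange(int(fs * duration)) / fs

# Stand-in for a recorded voice command: a 300 Hz tone
voice = np.sin(2 * np.pi * 300 * t)

# Amplitude-modulate the "voice" onto the ultrasonic carrier.
# All the energy now sits near 30 kHz, so humans hear nothing.
modulated = (1 + 0.8 * voice) * np.sin(2 * np.pi * carrier_hz * t)

# A nonlinear microphone responds to the square of the input signal;
# the low-frequency terms of modulated**2 contain a demodulated
# copy of the original 300 Hz "command."
demodulated = modulated ** 2
```

Low-pass filtering `demodulated` would leave a signal resembling the original command, which is why the hardware fix discussed below is a microphone that rejects energy outside the range of a human voice.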

Pretty cool, huh?

To be honest, I’m not sure DolphinAttack could actually orchestrate the delivery of a $2,499 espresso maker to my mail receiving service. But the researchers did show that they could use DolphinAttack to force a voice-activated device to visit a website infected with malicious software (malware). The malware could then be surreptitiously installed on your device, allowing hackers to use your device to send fake emails and text messages. They could even make your device part of a “botnet” – a collection of internet-connected devices infected and controlled by malware.

Is your voice-activated device safe? Probably not. The researchers successfully tested DolphinAttack against 16 devices running various versions of the operating systems that power Apple, Android, and Windows products. Alexa, Cortana (the voice-activated assistant in Windows 10), Google Assistant, and Siri are all vulnerable. What’s more, the researchers demonstrated the attack would work not only in English, but in Chinese, French, German, and Spanish.

There is a relatively simple fix to this vulnerability: turn off voice activation on these devices. But doing so greatly reduces their functionality. The only long-term solution is for the manufacturers of these devices to modify the microphones so that they don’t respond to commands outside the frequency range of a human voice.

Unfortunately, DolphinAttack is only the tip of the iceberg when it comes to the mischief hackers can wreak with voice recognition. Last June, a BBC reporter demonstrated that Voice ID, the voice-authentication software used by banking giant HSBC, could be deceived. To be fair, this wasn’t an attack just anyone could pull off: it was the reporter’s identical twin who managed to spoof the system and access his account.

Still, it was alarming that the system gave the twin a total of eight attempts to gain access. Another researcher discovered that Voice ID allowed 20 separate attempts to access an account.

Worse, software is being developed that can accurately reproduce anyone’s voice, given a sufficient number of voice samples. One system I’ve tested is Lyrebird. It creates a digital voice that sounds like you from just one minute of recorded audio. Once you’ve recorded that one-minute sample, you can generate any sentence or phrase you want in your digital voice.

The system isn’t perfect, but it’s pretty good.

Lyrebird suggests several possible applications for this technology:

  • Use your voice in personal assistants, for example reading tweets in the voice of the sender.

  • Create an avatar of yourself that can be used, for example, in video games.

  • Read audiobooks with the voice of your choice.

  • Provide a personalized voice for people who can no longer speak because of illness.

  • Freeze movie actors’ voices to make them available forever.

I foresee more nefarious uses. For instance, if someone wants to impersonate you and can collect samples of your voice, they could use a tool like Lyrebird to create a digital voice. Your impersonator could then use that voice for whatever purpose they want – to clean out your bank account, for instance.

I also predict a proliferation of “fake news” using this technology. Imagine a Breitbart video clip in which Nancy Pelosi confesses to embezzling millions of dollars. Or one on MSNBC where Donald Trump admits that he conspired with Russia to fix the 2016 election.

There are no easy solutions to deal with the security vulnerabilities of voice recognition systems. Personally, I’ve made the decision to not purchase most voice-activated devices. I have a smart TV equipped with Wi-Fi so I can watch streaming video on Netflix, HBO, etc. But it doesn’t respond to vocal commands. And I will never purchase Alexa, Google Home, a smart thermostat, a smart door lock, or any other device that can be remotely activated.

Unless viable defenses to these attacks are developed (and I’m not holding my breath for that to happen), I suggest that you do the same.

On another note, many clients first get to know us by accessing some of our well-researched courses and reports on important topics that affect you.

Like How to Go Offshore in 2024, for example. It tells the story of John and Kathy, a couple we helped from the heartland of America. You’ll learn how we helped them go offshore and protect their nest egg from ambulance chasers, government fiat, and the decline of the US dollar… and access a whole new world of opportunities not available in the US. Simply click the button below to register for this free program.
