Voice Recognition to Prevent Fraud? Think Again!
Your voice is the newest security feature used to uniquely identify you. Smartphones and devices like the Amazon Echo let you add items to your personal calendar, make calls, or send emails using only your voice.
Amazon and Google also offer voice-activated shopping. With Amazon Alexa, you can order millions of products on Amazon.com. A Google Home smart speaker offers similar capabilities, and you can buy from any retailer tied into the system.
Banks have also climbed onto the voiceprint bandwagon. Citibank and HSBC, among other financial institutions, offer account authentication through voice recognition. Once you opt in, you can authenticate your account with your voice when you contact customer service by phone.
What could possibly go wrong? A lot, it turns out.
Last year, researchers demonstrated that hackers could pass instructions to voice-activated assistants like Alexa or Siri using ultrasonic frequencies humans can’t hear. Researchers in China developed a technique they call DolphinAttack to communicate with the voice recognition systems these assistants use. All a hacker needs to do is say something like, “Alexa, please send Mark Nestmann a
DeLonghi America ECAM28465M Prima Donna Fully Automatic Espresso Machine with Lattecrema System.” DolphinAttack converts the command into an ultrasonic frequency and sends it to your voice-activated assistant.
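The core trick is simple signal processing: the spoken command is amplitude-modulated onto an ultrasonic carrier (above the ~20 kHz limit of human hearing), and nonlinearities in the device's microphone circuitry demodulate it back into the audible band. Here's a minimal sketch of that modulation step in Python with NumPy. This is an illustration of the principle only, not the researchers' actual attack code; the sample rate, carrier frequency, and the toy "command" tone are all assumptions for the example.

```python
import numpy as np

SAMPLE_RATE = 192_000  # Hz; ultrasonic work requires a high sample rate

def ultrasonic_modulate(command_audio: np.ndarray, carrier_hz: float = 30_000.0) -> np.ndarray:
    """Amplitude-modulate an audio command onto an ultrasonic carrier.

    The result is inaudible to humans (all energy sits above ~20 kHz),
    but nonlinearity in a microphone's circuitry can demodulate it back
    into the audible band, where the assistant "hears" the command.
    """
    t = np.arange(len(command_audio)) / SAMPLE_RATE
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM: the carrier plus command-shaped sidebands around it
    return (1.0 + command_audio) * carrier

# Toy "command": a 400 Hz tone standing in for one second of recorded speech
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.5 * np.sin(2 * np.pi * 400 * t)
modulated = ultrasonic_modulate(command)

# Verify the modulated signal's energy is concentrated near 30 kHz,
# well outside the range of human hearing
spectrum = np.abs(np.fft.rfft(modulated))
freqs = np.fft.rfftfreq(len(modulated), d=1 / SAMPLE_RATE)
peak_hz = freqs[np.argmax(spectrum)]
print(round(peak_hz))  # dominant frequency, in Hz
```

The dominant frequency of the output lands at the 30 kHz carrier, which is why you hear nothing while your assistant hears everything.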
Pretty cool, huh?
To be honest, I’m not sure DolphinAttack could actually orchestrate the delivery of a $2,499 espresso maker to my mail receiving service. But the researchers did show that they could use DolphinAttack to force a voice-activated device to visit a website infected with malicious software (malware). The malware could then be surreptitiously installed on your device, allowing hackers to use it to send fake emails and text messages. They could even make your device part of a “botnet” – a collection of internet-connected devices infected and controlled by malware.
Is your voice-activated device safe? Probably not. The researchers successfully tested DolphinAttack against 16 different devices using different versions of the operating systems used to power Apple, Android, and Windows products. Alexa, Cortana (the voice-activated assistant in Windows 10), Google Assistant, and Siri are all vulnerable. What’s more, the researchers demonstrated the attack would work not only in English, but in Chinese, French, German, and Spanish.
There is a relatively simple fix to this vulnerability: turn off voice activation on these devices. But doing so greatly reduces their functionality. The only long-term solution is for the manufacturers of these devices to modify the microphones so that they don’t respond to commands outside the frequency range of a human voice.
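To see why that hardware fix would work, here is a minimal sketch, in Python with NumPy, of discarding everything above the human-voice band. This assumes a digital front end for simplicity; a real microphone would need the equivalent analog filtering before the nonlinear components that demodulate the attack. The frequencies and signals are illustrative assumptions.

```python
import numpy as np

SAMPLE_RATE = 192_000  # Hz

def suppress_ultrasound(signal: np.ndarray, cutoff_hz: float = 20_000.0) -> np.ndarray:
    """Zero out spectral content above the human-hearing band.

    A microphone (or its firmware) that discards energy above ~20 kHz
    never "hears" an ultrasonic carrier in the first place.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE)
    spectrum[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
speech = 0.5 * np.sin(2 * np.pi * 400 * t)      # audible content (kept)
attack = 0.5 * np.sin(2 * np.pi * 30_000 * t)   # ultrasonic carrier (removed)
filtered = suppress_ultrasound(speech + attack)

# The audible tone survives intact; the ultrasonic component is gone
residual = np.max(np.abs(filtered - speech))
print(residual < 1e-6)
```

The audible "speech" passes through unchanged while the ultrasonic "attack" vanishes entirely, which is exactly the behavior manufacturers would need to build into the microphone hardware itself.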
Unfortunately, DolphinAttack is only the tip of the iceberg when it comes to the mischief hackers can wreak with voice recognition. Last June, a BBC reporter demonstrated that Voice ID, the voice-authentication software used by banking giant HSBC, could be deceived. To be fair, this wasn’t an attack just anyone could pull off; it was the reporter’s identical twin who managed to spoof the system and access his account.
Still, it was alarming that the system gave the twin a total of eight attempts to gain access. Another researcher found that Voice ID allowed a full 20 separate attempts to access an account.
Worse, software now exists that can accurately reproduce anyone’s voice, given enough samples of that person speaking. One system I’ve tested, called Lyrebird, lets you create a digital voice that sounds like you using only one minute of audio as a sample. Once you’ve recorded the sample, you can generate any sentence or phrase you want in your digital voice.
The system isn’t perfect, but it’s pretty good. Listen to Donald Trump’s synthesized voice here and Barack Obama’s here. Obama’s voice even comes with an accompanying video track that mimics the movements of his mouth to make the recording appear even more realistic.
Lyrebird suggests several possible applications for this technology:
Use your voice in personal assistants, for example reading tweets in the voice of the sender.
Create an avatar of yourself that can be used, for example, in video games.
Read audiobooks with the voice of your choice.
Provide a personalized voice for people who have diseases and can no longer speak.
Preserve movie actors’ voices to make them available forever.
I foresee more nefarious uses. For instance, if someone wants to impersonate you, and can collect samples of your voice, he could use a tool like Lyrebird to create a digital voice. Your impersonator could then use your digital voice for whatever purpose he wants – to clean out your bank account, for instance.
I also predict a proliferation of “fake news” using this technology. Imagine a Breitbart video clip in which Nancy Pelosi confesses to embezzling millions of dollars. Or one on MSNBC where Donald Trump admits that he conspired with Russia to fix the 2016 election.
There are no easy solutions to the security vulnerabilities of voice recognition systems. Personally, I’ve decided not to purchase most voice-activated devices. I have a smart TV equipped with Wi-Fi so I can watch streaming video on Netflix, HBO, etc. But it doesn’t respond to voice commands. And I will never purchase an Amazon Echo, a Google Home, a smart thermostat, a smart door lock, or any other device that can be remotely activated.
Unless viable defenses to these attacks are developed (and I’m not holding my breath for that to happen), I suggest that you do the same.
Protecting your assets (and yourself) against any threat - from the government, the IRS or a frivolous lawsuit - is something The Nestmann Group has helped more than 15,000 Americans do over the last 30 years.
Feel free to get in touch at firstname.lastname@example.org or call +1 (602) 688-7552 to learn how we can help you.
Want to learn more about us first?
Why not get instant access to my very popular e-course - Inside the World of Big Money Asset Protection. It tells the story of John and Kathy, two clients we helped from the heartland of America.
We subsidize copies of the course to new readers. In other words, it's yours free.
Many clients have used this program to really be clear about what they need to do - and how to get started. You likely will too.
To begin, we just need to know where to send it: