Alexa Vulnerability Is A Reminder To Delete Your Voice History
If you haven’t been regularly deleting your voice history with Amazon’s voice assistant, Alexa, you could have a good reason to start: a recently fixed vulnerability that would’ve exposed all your conversations with the smart speaker.
On Thursday, researchers from cybersecurity firm Check Point released a report detailing security issues they discovered with Amazon’s Alexa, which would’ve allowed a potential hacker to retrieve a person’s conversation logs with the smart speaker and install skills on the device without the person knowing.
“The security of our devices is a top priority, and we appreciate the work of independent researchers like Check Point who bring potential issues to us. We fixed this issue soon after it was brought to our attention, and we continue to further strengthen our systems,” an Amazon spokesperson said in a statement.
The company said it was contacted by the researchers in June and that it hadn’t seen any cases of the vulnerability being used. But the security concerns serve as a strong reminder to minimize the amount of history logged with your smart speakers.
Connected devices at home present a new opening for hackers, and smart voice assistants are no different. Security researchers have frequently demonstrated flaws with Alexa, like a stranger yelling to unlock your door or a laser pointer being able to activate your device from 300 feet away.
Many of these concerns are mitigated by the fact that an attacker would need to be near your home or within your speaker’s range, but the flaws Check Point found could’ve been exploited with just a single click, researchers said.
The vulnerability involved certain Amazon subdomains, such as track.amazon.com. Though you might be skeptical enough to avoid clicking on suspicious links, a URL with Amazon’s domain in it could be enough to make you believe you’re safe.
The security researchers discovered that they could inject code into a vulnerable subdomain and use it to extract a security token tied to your Alexa account. With that token, a potential attacker could pose as you to install skills, get a list of the skills you’re already using, and view your voice chat history with Alexa.
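In miniature, the flow the researchers described looks something like the sketch below. This is a hypothetical simulation, not Amazon’s real API: the session object, token value, header name, and endpoint string are all invented for illustration. The point is only the mechanics — script injected into a trusted subdomain runs with that page’s credentials, grabs a security token, and the attacker replays it to act as the victim.

```javascript
// Hypothetical sketch, simulated with plain objects (not Amazon's real API).

// The victim's session on the vulnerable subdomain (simulated).
const victimSession = { csrfToken: "token-abc123" };

// Step 1: the injected script reads a token the page can already access.
function stealToken(session) {
  return session.csrfToken;
}

// Step 2: the attacker attaches the stolen token to their own request,
// e.g. listing installed skills or fetching the voice history.
function forgeRequest(token, action) {
  return { action, headers: { "csrf-token": token } };
}

const stolenToken = stealToken(victimSession);
const forged = forgeRequest(stolenToken, "GET /voice-history");
// The forged request now carries the victim's token, so a server that
// trusts the token alone would treat it as coming from the victim.
```

Because the malicious script runs on an amazon.com subdomain, the browser’s same-origin protections don’t stand in its way — which is why a single click on a legitimate-looking link was enough.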
Depending on how sensitive your conversations with Alexa are, that could mean access to your health information, your finances, or just the silly day-to-day stuff you might ask the voice assistant.
“Smart speakers and virtual assistants are so commonplace that it’s easy to overlook just how much personal data they hold, and their role in controlling other smart devices in our homes,” Oded Vanunu, Check Point’s head of products vulnerability research, said in a statement. “But hackers see them as entry points into people’s lives, giving them the opportunity to access data, eavesdrop on conversations or conduct other malicious actions without the owner being aware. We conducted this research to highlight how securing these devices is critical to maintaining users’ privacy.”
Check Point said attackers could’ve eavesdropped on conversations by installing a malicious skill, but Amazon scans skills for malicious activity and blocks them from its marketplace. The voice history log is the bigger concern, and the vulnerability is a reminder that you should be regularly deleting your conversations with Alexa.
Like other voice assistant providers, Amazon keeps records of your voice history to improve its own artificial intelligence, and unless you opt out, human reviewers will listen to those conversations, too.
You can set your voice history to delete automatically after three months or 18 months, but if you want it deleted every day or every week, you’ll need to do it manually.
Given vulnerabilities like this one, that’s a good practice: the less history you keep, the less a hacker could access. Ask yourself: Do the pros of keeping a history of your conversations with Amazon outweigh the cons?
Though deleting your voice history could keep you safe from potential hackers, you might still have some privacy concerns regarding Amazon’s policies.
In a letter to senators from July 2019, Amazon said it keeps some transcripts of voice recordings indefinitely, even when the audio itself is deleted.