Amazon Alexa’s ‘skills’ can ‘pose significant privacy, security risk,’ study warns

RALEIGH, N.C. — When most consumers use Amazon’s friendly voice-activated assistant, they probably think they’re just dealing with the famous Alexa. It turns out, however, Alexa is just a “middle woman” for countless third parties that could put your private information in harm’s way. Researchers from North Carolina State University reveal a number of vulnerabilities in how Alexa vets and handles the programs users interact with through the popular Amazon device.

“When people use Alexa to play games or seek information, they often think they’re interacting only with Amazon,” says study co-author Anupam Das in a university release. “But a lot of the applications they are interacting with were created by third parties, and we’ve identified several flaws in the current vetting process that could allow those third parties to gain access to users’ personal or private information.”

The danger stems from the thousands of programs, or skills, that can run on Alexa. These skills function like the apps on a smartphone, doing everything from playing music to ordering groceries. Study authors say Amazon has sold at least 100 million Alexa devices, and there are currently over 100,000 skills users can install. Since the vast majority of these programs are created by third parties and have access to homes all around the world, researchers set out to identify any security issues in this relationship.

The team used an automated program to gather over 90,000 different skills from seven different skill stores. They then created a review process that thoroughly analyzed each of these programs.
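The study’s own tooling isn’t detailed in this release, but the general approach of walking through store listings and recording each skill’s metadata can be sketched roughly. In the sketch below, the store URLs, page structure, and field names are placeholders chosen for illustration, not Amazon’s actual endpoints or markup.

```python
# Rough illustration only: the store URLs, query parameters, and CSS classes
# below are made-up placeholders, not Amazon's real endpoints or markup.
import json
import time

import requests
from bs4 import BeautifulSoup

STORE_URLS = {
    "US": "https://example.com/alexa-skills?region=us",  # placeholder
    "UK": "https://example.com/alexa-skills?region=uk",  # placeholder
    # ...one entry per skill store being crawled
}

def collect_skills(store_url, pages=5, delay=1.0):
    """Fetch listing pages and pull out basic metadata for each skill."""
    skills = []
    for page in range(1, pages + 1):
        resp = requests.get(store_url, params={"page": page}, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # ".skill-card" and the field classes are assumed markup for illustration.
        for card in soup.select(".skill-card"):
            policy_link = card.select_one("a.privacy-policy")
            skills.append({
                "name": card.select_one(".skill-name").get_text(strip=True),
                "developer": card.select_one(".skill-developer").get_text(strip=True),
                "invocation": card.select_one(".skill-invocation").get_text(strip=True),
                "privacy_policy": policy_link.get("href") if policy_link else None,
            })
        time.sleep(delay)  # pause between requests
    return skills

if __name__ == "__main__":
    all_skills = {region: collect_skills(url) for region, url in STORE_URLS.items()}
    with open("skills.json", "w") as f:
        json.dump(all_skills, f, indent=2)
```

Once the metadata is gathered in one place, the kinds of checks the researchers describe below, such as shared activation phrases or missing privacy policies, become simple passes over the data.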

One of the first problems researchers discovered involves the developer name that skill stores display alongside each program. While this may sound like a useful signal, the study finds Amazon does not actually verify that the person or company publishing the skill is who they claim to be. Simply put, a skill’s developer can claim to be anyone they want.

Researchers warn that this makes it easy for a malicious attacker to register software under the name of a trustworthy company. From there, they can fool users into thinking the skill is coming from a reputable source — aiding in phishing attacks.

Are Alexa’s ‘skills’ pulling a bait-and-switch?

The report also shows that Amazon allows more than one skill to use the same activation phrase.

“This is problematic because, if you think you are activating one skill, but are actually activating another, this creates the risk that you will share information with a developer that you did not intend to share information with,” explains Das, an assistant professor of computer science at NC State. “For example, some skills require linking to a third-party account, such as an email, banking, or social media account. This could pose a significant privacy or security risk to users.”
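A hedged sketch of how that overlap could be surfaced in collected metadata follows; it reuses the hypothetical skills.json format from the earlier crawler sketch and simply groups skills by their normalized invocation phrase.

```python
# Sketch: group skills by invocation phrase and report collisions.
# Assumes the hypothetical skills.json format from the crawler sketch above.
import json
from collections import defaultdict

def find_shared_invocations(skills):
    """Map each normalized invocation phrase to the skills that claim it."""
    by_phrase = defaultdict(list)
    for skill in skills:
        phrase = skill["invocation"].strip().lower()
        by_phrase[phrase].append((skill["name"], skill["developer"]))
    # Only phrases claimed by more than one skill create a risk of confusion.
    return {p: owners for p, owners in by_phrase.items() if len(owners) > 1}

with open("skills.json") as f:
    data = json.load(f)

for region, skills in data.items():
    collisions = find_shared_invocations(skills)
    print(f"{region}: {len(collisions)} invocation phrases shared by multiple skills")
```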

Additionally, study authors showed that developers can change their program’s code after the skill is placed in a store. Because that code runs on the developer’s own back end, publishers can quietly depart from the behavior Amazon initially approved. Developers could conceivably request more sensitive information from users than Amazon was originally willing to allow.

“We were not engaged in malicious behavior, but our demonstration shows that there aren’t enough controls in place to prevent this vulnerability from being abused,” Das says.

What is Amazon doing to police skills?

The study finds Amazon does have some privacy protections that guard a user’s information. These mainly cover sensitive personal data such as location, full names, and phone numbers.

One of these safeguards also requires skills which request user data to have a publicly available privacy policy. The policy has to explain why the developers want this data and how their skills will use it.

Unfortunately, the study reveals that 23.3 percent of the tested skills that request user data either lack a privacy policy entirely or have one that is misleading or incomplete. Researchers say some of the skills tested requested private information even though their own policies state they won’t collect such data.
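A simple pass over the same hypothetical metadata can surface the first of those cases, skills that ask for sensitive data but link to no policy at all. In the sketch below, the “permissions” field is an assumption added for illustration rather than part of any real Amazon schema, and judging whether an existing policy is misleading or incomplete would still require reading the policy text itself.

```python
# Sketch: flag skills that request sensitive data but provide no privacy policy.
# "permissions" is an assumed field, not part of any real Amazon schema.
import json

SENSITIVE = {"location", "full name", "phone number", "email"}

def flag_policy_gaps(skills):
    """Return (skill name, requested data) pairs for skills with no policy link."""
    flagged = []
    for skill in skills:
        requested = SENSITIVE & set(skill.get("permissions", []))
        if requested and not skill.get("privacy_policy"):
            flagged.append((skill["name"], sorted(requested)))
    return flagged

with open("skills.json") as f:
    data = json.load(f)

for region, skills in data.items():
    for name, fields in flag_policy_gaps(skills):
        print(f"[{region}] {name} requests {fields} but lists no privacy policy")
```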

“This release isn’t long enough to talk about all of the problems or all of the recommendations we outline in the paper,” Das concludes. “There is a lot of room for future work in this field. For example, we’re interested in what users’ expectations are in terms of system security and privacy when they interact with Alexa.”

The NC State team outlined a number of recommendations on how Amazon can better secure its systems. These include validating the identities of skill developers and using audio or visual cues to let users know when they’re interacting with a skill created by a third party.

Researchers presented the findings at the Network and Distributed System Security (NDSS) Symposium 2021 in February.
