Study: Virtual assistants aren’t helpful when it comes to addiction – yet

LA JOLLA, Calif. — Roughly half of all U.S. adults have a smart virtual assistant in their home. While devices such as Amazon’s Alexa or Google Assistant are undeniably convenient for scheduling appointments or playing hands-free tunes, a new study finds they aren’t very helpful when it comes to addiction support or advice. As these devices grow more capable and conversational, however, that is likely to change.

Researchers from the Center for Data Driven Health at the Qualcomm Institute, based at the University of California San Diego, say the makers of intelligent virtual assistants plan to add health care advice, including personalized wellness strategies, to their products in the near future. In the meantime, the study’s authors were curious to see what current models have to offer in the way of addiction support.

“One of the dominant health issues of the decade is the nation’s ongoing addiction crisis, notably opioids, alcohol, and vaping. As a result, it is an ideal case study to begin exploring the ability of intelligent virtual assistants to provide actionable answers for obvious health questions,” says study author Dr. John W. Ayers in a release.

The research team asked five devices (Amazon’s Alexa, Google Assistant, Microsoft’s Cortana, Samsung’s Bixby, and Apple’s Siri) to “help me quit” a range of drugs and substances, including alcohol, opioids, and tobacco. Across 70 total queries, the devices responded with actionable, legitimate advice only four times.

Most of the time, these supposedly smart devices responded to the pleas for help with confusion, such as, “Did I say something wrong?” In one instance, when an Alexa device was asked to help the speaker quit drugs, the assistant responded with a definition of the word “drugs.” Not exactly helpful.

Google Assistant offered perhaps the most relevant response when asked to help the speaker quit smoking, responding with a recommendation for Dr. QuitNow, an app designed to help people quit tobacco. Meanwhile, when Siri was asked to help an individual quit marijuana, the device offered a promotion for a local cannabis store.

The research team realizes that no single response from a computer will solve anyone’s addiction problems, but these wildly popular assistants are nonetheless in a position to provide real help to people in need.

“1-800 helplines are central to the national strategy for addressing substance misuse,” adds Dr. Eric C. Leas, a study co-author. “For instance, calling 1-800-Quit-Now when you’re thinking about quitting smoking is the gold-standard advice an intelligent virtual assistant can instantaneously provide at the moment someone is asking for help.”

The study’s authors believe that most, if not all, of these manufacturers already have the means to deliver more meaningful responses to addiction-related questions.

“Alexa can already fart on demand, why can’t it and other intelligent virtual assistants also provide life saving substance use treatment referrals for those desperately seeking help? Many of these same people likely have no one else to turn to except the smart device in their pocket,” Dr. Ayers explains.

“Only 10% of Americans that need treatment for substance misuse receive it. Because intelligent virtual assistants return the optimal answer to a query, they can provide a huge advantage in disseminating resources to the public. Updating intelligent virtual assistants to accommodate help-seeking for substance misuse could become a core and immensely successful mission for how tech companies address health in the future,” concludes study co-author Dr. Alicia Nobles.

The study is published in npj Digital Medicine.
