
‘Hey Alexa, are you HIPAA compliant?’

While you may be tempted to bring your shiny new smart speaker to your office, you would be well advised to keep it at home until virtual assistants become well acquainted with HIPAA.

In the not-so-distant past, people would look at me with expressions of concern as I attempted to verbally coerce my computer into not crashing while I submitted my term paper minutes before the deadline. While talking to inanimate objects used to be cause for concern (unless you’re David Hasselhoff accompanied by KITT from Knight Rider), nowadays it is an everyday occurrence for much of the population.

A significant majority of our personal electronics now come embedded with virtual assistant software: Google’s Google Assistant, Apple’s Siri, and Amazon’s Alexa among them. With these virtual assistants residing in our devices, we can speak to our phones, watches, computers, and now even our speakers. The last item on that list has exploded in popularity over the past year. Smart speakers were among the hottest items of the holiday season, thanks in large part to Black Friday sales that lingered through the end of the holidays. With Apple sitting out the holiday sales due to the delayed launch of its HomePod smart speaker, Google and Amazon dominated the smart speaker market, each boasting sales in the millions of their respective devices, according to industry estimates.

Smart speakers are capable of understanding and carrying out tens of thousands of actions based on simple voice commands. The short list of these tasks includes creating shopping lists, playing songs by your favorite band, turning your lights on and off, and adjusting your home’s thermostat. Users can even purchase items simply by asking for them and rest assured that their order is on its way. The list of tasks these smart speakers can manage is continuously growing.

While these devices were originally intended for home use, they have slowly trickled into our offices thanks to their massive potential and their ability to keep track of calendar entries with a simple voice command. It goes without saying that many physicians and healthcare professionals will be tempted to use them for note taking, web research, or accessing medical records. But before you do, don’t. These virtual assistant programs are not yet in compliance with the Health Insurance Portability and Accountability Act (HIPAA).

The well-known goal of HIPAA is to safeguard protected health information (PHI). Health information is among the most sensitive, and most under-secured, data in the world today. Ransomware hackers have increasingly targeted hospitals and physician offices for this reason. In particular, they focus their attacks on solo and small medical practices, because these practices tend to spend less on information technology, making them low-hanging fruit for sophisticated hackers.

It is therefore understandable that Alexa and Google Assistant need to jump through more hoops before being allowed into a hospital room or given access to medical records. Soon enough, Siri will need to follow in their virtual footsteps. While Google and Amazon have worked on making their cloud services compliant with HIPAA’s standards, neither smart speaker, nor its respective virtual assistant, is HIPAA compliant at this time. Physicians, hospitals, and practices should proceed, for the near term, as though Alexa, Siri, and Google Assistant are not HIPAA compliant. A physician’s failure to secure medical record data can not only cost hundreds of thousands of dollars but also hand thieves the opportunity to commit identity theft.

Potential looms, but change is needed

Virtual assistants must first be taught (i.e., programmed) to avoid healthcare-related mistakes and abuse. For example, if hospitals use Alexa to draft hospital notes and to place orders for procedures or medications, safeguards would need to be implemented to prevent anyone who is not a physician from walking into a patient’s room and creating an order. Likewise, if the smart speaker incorrectly “hears” the name of a medication and places an order for the wrong one, that would create obvious patient-safety issues. Once the technology is more advanced and protections are in place, it will be up to hospitals to properly implement voice-activated technology into the healthcare system.

This does not mean that your new, eager virtual assistant cannot be used for healthcare purposes. For example, having a patient use his or her smart home device to set a reminder to take medications at a certain time would be an acceptable use, as it is a generic request. Ordering a prescription for your patient through the service, however, would be a violation, since personal information such as name, prescription, and home address would need to be provided. Asking Google to look up the definition of sphenopalatine ganglioneuralgia is acceptable; setting a reminder to tell patient Jane at her appointment that her headaches are caused by eating ice cream too quickly would be a HIPAA violation. That act would be akin to leaving a handwritten note with the same information in your office lobby for anyone with a hint of curiosity to read.

While you may be tempted to bring your shiny new Amazon Echo, Google Home, or HomePod to your office, you would be well advised to keep your smart speaker at home for the time being, until virtual assistants become well acquainted with HIPAA. Until then, try to stay content with asking your device to play “Dear Doctor” by the Rolling Stones to pass the time.

Kevin Peek is an associate in the St. Louis office of Sandberg Phoenix focusing on cases involving medical malpractice defense and defense of providers in correctional healthcare.

Kyle Haubrich, JD, counsel at Sandberg Phoenix in St. Louis, also contributed to this article.
