Many healthcare professionals are quick to adopt new technologies in their day-to-day personal lives. A typical morning could begin by asking Alexa or Google for the day’s news and weather, followed by an audio synopsis of the day’s schedule and unread email. Smart speakers are rapidly becoming more powerful and pervasive.
Following this conversation with a smart speaker would be a workout measured and dictated by one’s Apple Watch or Fitbit. Drawing on the World Health Organization’s recommended levels of physical activity, wearable tech nudges its wearer toward behavioral change. Everything is monitored, analyzed and prescribed in real time. No one questions whether you can keep up with these new technologies; the question is whether the technologies can keep pace with you.
In hospitals, however, it’s an entirely different story. In the United States, healthcare services are governed by HIPAA (the Health Insurance Portability and Accountability Act of 1996). HIPAA, among other things, requires the protection and confidential handling of protected health information. The same technology that vastly improves the lives of healthcare professionals on their own time is hindered by healthcare privacy regulations enacted more than twenty years ago.
As smart technologies enter the field of healthcare, HIPAA compliance and enforcement officials are working to scale the act to include this new technology. Is wearable tech permissible? Is electronic protected health information (ePHI) at risk when smart speakers are utilized?
With wearables, it’s complicated.
HIPAA, of course, applies to covered entities — like providers and insurers — and business associates, meaning vendors. The rise of wearable technology has raised questions as devices such as smartwatches have begun collecting more health data, sometimes for clinical use. But data gathered via wearables don’t always fall under HIPAA security guidelines.
For example, if a person buys a Fitbit and then uses it to track information like number of steps taken per day, calories consumed and heart rate, the data isn’t protected under HIPAA. Why? This equation lacks a covered entity or business associate.
But consider this: At the direction of a healthcare provider, a patient downloads a smartwatch app that monitors health data points that are then integrated into an electronic health record. The app developer or marketer, meanwhile, is receiving money from the provider for the digital service. In that case, the developer is generating, collecting, storing and sharing data on behalf of a covered entity — and, as a business associate, it must abide by HIPAA.
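To make that business-associate relationship concrete, here is a minimal, hypothetical sketch of what such an integration might look like: a wearable app forwarding a heart-rate reading into a provider’s electronic health record as a FHIR Observation. The endpoint, access token and patient identifier are placeholders, and a real deployment would also need consent handling, encryption in transit and at rest, and audit logging to satisfy HIPAA.

```python
# Hypothetical sketch: a wearable app pushing a heart-rate reading to a
# provider's EHR as a FHIR Observation. Endpoint, token, and patient ID
# are placeholders, not a real integration.
import requests

EHR_FHIR_BASE = "https://ehr.example-hospital.org/fhir"  # hypothetical FHIR endpoint
ACCESS_TOKEN = "..."  # obtained via the provider's OAuth2 flow (not shown)

def push_heart_rate(patient_id: str, bpm: float) -> None:
    observation = {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [
                {"system": "http://loinc.org", "code": "8867-4", "display": "Heart rate"}
            ]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "valueQuantity": {"value": bpm, "unit": "beats/min"},
    }
    resp = requests.post(
        f"{EHR_FHIR_BASE}/Observation",
        json=observation,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    push_heart_rate("example-patient-123", 72.0)
```

The moment data like this flows into a covered entity’s record on the provider’s behalf, the app developer operating this pipeline is squarely in business-associate territory.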
With artificial intelligence voice assistants, there’s more clarity: Alexa, Siri and their peers are not complying with HIPAA in this area. At least not yet.
What next?
Virtual assistants must first be taught (i.e., programmed) to avoid mistakes and abuse related to healthcare. For example, if hospitals use Alexa to draft hospital notes and allow it to place orders for procedures or medications, safeguards would need to prevent anyone who is not a physician from walking into a patient’s room and creating an order. Likewise, if the smart speaker mishears the name of a medication and places an order for the wrong one, that creates obvious problems. Once the technology is more advanced and protections are in place, it will be up to hospitals to properly implement voice-activated technology into the healthcare system.
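As a rough illustration of those two safeguards, here is a hypothetical sketch: it checks that the speaker is a credentialed prescriber and that the transcribed medication matches the formulary before an order goes through, holding near-misses for explicit confirmation. The names, roles and matching threshold are illustrative, not a production workflow.

```python
# Hypothetical sketch of the safeguards described above: verify the speaker's
# role, then guard against mis-heard medication names before accepting a
# voice-dictated order. All names and thresholds are invented for illustration.
from difflib import get_close_matches

FORMULARY = {"metformin", "metoprolol", "lisinopril", "hydralazine", "hydroxyzine"}
AUTHORIZED_PRESCRIBERS = {"dr_lee", "dr_patel"}  # in practice, checked against the identity system

def handle_voice_order(speaker_id: str, transcribed_drug: str) -> str:
    # Safeguard 1: only credentialed prescribers may create orders by voice.
    if speaker_id not in AUTHORIZED_PRESCRIBERS:
        return "REJECTED: speaker is not an authorized prescriber"

    # Safeguard 2: accept only an exact formulary match; near-misses
    # (e.g. hydralazine vs. hydroxyzine) go back to the clinician to confirm.
    drug = transcribed_drug.strip().lower()
    if drug in FORMULARY:
        return f"ORDER PLACED: {drug}"
    suggestions = get_close_matches(drug, list(FORMULARY), n=3, cutoff=0.75)
    if suggestions:
        return f"CONFIRMATION REQUIRED: did you mean {', '.join(suggestions)}?"
    return "REJECTED: medication not found in formulary"

print(handle_voice_order("dr_lee", "hydralazene"))    # near-miss -> confirmation required
print(handle_voice_order("visitor_42", "metformin"))  # unauthorized speaker -> rejected
```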
In the near future, we could have our genetic code on our smartphones, enabling personalized treatment to be prescribed in real time. This is but one example of the tremendous power of new technologies in this digital age.
Healthcare systems are looking to these technologies to find efficiencies in their workflows and to spot trends in their data.
At Palooza, we announced that we are exploring how Machine Learning and Natural Language Processing could help hospitals better capture patient feedback and draw deeper insights from the data they collect. The Research & Innovation team completed a two-month NLP research project over the summer, developing a working proof of concept (POC) that applied NLP to event severity and HCAHPS data. Overall, the research was very positive, both in the results we found and in the practicality of adopting the technology.
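To give a flavor of what that kind of POC can involve (without disclosing the actual implementation), here is a hypothetical, minimal sketch: a TF-IDF plus logistic-regression pipeline that assigns a severity label to free-text patient feedback. All examples and labels below are invented, and anything handling real feedback would of course require de-identification and HIPAA-compliant storage.

```python
# Illustrative sketch only: a minimal text-classification pipeline tagging
# free-text patient feedback with an event-severity label. Training examples
# and labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data: (feedback text, severity label)
train_texts = [
    "Nurse responded quickly and explained my medication clearly",
    "Waited four hours in the ER before anyone checked on me",
    "Room was clean and the staff were friendly",
    "I was given the wrong dose and no one followed up",
]
train_labels = ["low", "moderate", "low", "high"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(train_texts, train_labels)

new_feedback = ["Discharge instructions were confusing and my calls were never returned"]
print(model.predict(new_feedback))  # predicted severity label; depends on training data
```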
As these conversations become more prominent in the industry, the use of these technologies raises questions about HIPAA compliance. We now have to be mindful of how technologies that harness the power of machine learning affect compliance with an act written in the era of 28.8 kbps dial-up internet, Netscape Navigator and AltaVista.