Is Alexa HIPAA Compliant? What Healthcare Orgs Must Know

Alexa is not HIPAA compliant for handling protected health information. Amazon launched a HIPAA-eligible skills program in 2019, but ended support for it in December 2022. If you’re a healthcare provider, developer, or patient wondering whether you can use Alexa to transmit or store health data with HIPAA protections, the short answer is: not anymore.

What Amazon Offered (and When It Ended)

In April 2019, Amazon announced that select Alexa skills could transmit and receive protected health information (PHI) under HIPAA. This was an invite-only program in which Amazon signed Business Associate Agreements (BAAs) with approved healthcare organizations, legally binding itself to HIPAA's data-handling requirements for business associates. Six initial partners joined the program, including Express Scripts for prescription tracking, Cigna for health coaching, and Boston Children’s Hospital for post-surgical care instructions.

The program allowed patients to do things like check prescription status, schedule appointments, and receive care instructions through voice commands. But Modern Healthcare reported in December 2022 that Amazon ended support for this HIPAA-protected Alexa tool. The program is no longer active, and Amazon is not currently offering new Business Associate Agreements for Alexa health skills.

What the BAA Actually Covered

Even when the program was running, the protections were narrow. Amazon’s Business Associate Agreement applied only to interactions processed through an approved “Health Skill,” meaning a specially designated, Amazon-approved Alexa skill built by a covered entity or business associate. Everything outside that skill carried no HIPAA protection.

The BAA explicitly excluded several common scenarios: any Alexa skills that weren’t designated Health Skills, any general interactions a user had with Alexa outside the Health Skill, and any information a user directed Alexa to use elsewhere. So if a patient asked a Health Skill about a prescription and then told Alexa to create a calendar reminder with that information, the reminder fell outside the BAA’s scope. The HIPAA protections also did not cover data before it reached Amazon’s servers, meaning anything happening on the device or network before transmission was the user’s responsibility.

Standard Alexa and Health Data

A regular Alexa device in your home or office has no HIPAA protections whatsoever. When you ask Alexa about symptoms, medications, or anything health-related, that interaction is governed by Amazon’s standard consumer privacy policy, not by HIPAA. Amazon processes and stores voice recordings from these interactions under its general terms of service, which allow uses that HIPAA would prohibit for protected health information.

This distinction matters for healthcare practices. If a medical office uses a standard Echo device and a patient happens to share health details through it, there is no Business Associate Agreement in place and no HIPAA-compliant infrastructure handling that data. The healthcare provider could face liability for allowing PHI to flow through an unprotected channel.

Alexa in Hospitals Still Exists, With Limits

Some hospitals have deployed Alexa devices in patient rooms through a program called Alexa Smart Properties, often paired with third-party platforms like Aiva. BayCare, for example, installed Alexa-enabled smart rooms where patients can request a blanket, control the television, adjust lighting, or ask for entertainment like music and news updates. Patient requests are routed through the Aiva platform to the appropriate staff member.

These hospital deployments are designed around convenience and comfort features rather than clinical data exchange. Patients can ask for a channel change or a glass of water, but the system routes service requests to staff rather than transmitting or storing medical records. Patients who prefer not to use the system can mute the device. The important thing to understand is that these setups are carefully structured to avoid handling PHI through Alexa itself, sidestepping the HIPAA question rather than solving it.

What This Means for Healthcare Organizations

If you’re building a telehealth platform, patient communication tool, or any system that handles PHI, Alexa is not a viable channel with HIPAA protections. The program that once made this possible no longer exists. Organizations that had built HIPAA-eligible Alexa skills lost that infrastructure when Amazon pulled support in late 2022.

For healthcare developers looking at voice technology, this means evaluating alternatives or building custom solutions where you control the data pipeline end to end. Any voice assistant integration that touches PHI requires a signed Business Associate Agreement with the platform provider, and Amazon is not currently offering one for Alexa.

For individual users, asking Alexa general health questions (like “what are symptoms of the flu”) is no different from searching the internet. But sharing personal medical details, appointment information, or prescription data through a standard Alexa device offers no privacy protections beyond Amazon’s consumer terms. Your health information in that context is not treated as PHI under federal law.