
The limits to using Siri in healthcare

A 2022 Frontiers in Public Health feasibility study demonstrated that voice assistants like Siri can help caregivers track symptoms and health events at home, such as logging medications or patient behaviors via voice notes, yet clinical adoption remains rare. The chief barrier is Siri's inability to meet HIPAA standards: Apple does not sign business associate agreements (BAAs), the contracts that obligate a vendor to safeguard protected health information (PHI).

This creates risks of data exposure during voice interactions or cloud processing. AI systems like Siri struggle with unstructured medical data, such as imaging or nuanced symptom descriptions, limiting their diagnostic utility. Errors in voice recognition or recommendations could lead to misdiagnosis, yet accountability frameworks for AI-driven mistakes in healthcare are undefined. 

For example, benign moles could be misclassified as malignant due to adversarial noise in data inputs, highlighting reliability concerns. These limitations are compounded by technological gaps, including insufficient standardization of medical data formats and lack of transparency in AI decision-making processes. 

While exploratory applications show promise in non-clinical tasks (e.g., administrative reminders), healthcare organizations avoid deploying Siri for PHI-related workflows due to these unresolved risks.

 

What Siri can and cannot do 

Siri can support non-clinical tasks, but cannot function as a standalone wellness or health app. Its capabilities include tracking symptoms, medications, and lifestyle metrics via voice notes, as evidenced by a study where caregivers used a Siri-integrated app to monitor pediatric health events. 

In a pilot dubbed the Wellness App, voice reminders improved diet and activity indices among users, with metrics like the Mediterranean Adequacy Index (MAI) and Target Calorie Index (TCI) showing gains. According to the study published in Sensors (MDPI) that assessed the pilot's results, “A key benefit of this technology is the possibility of continuously monitoring vital and physiological signs without obstructing the comfort of the user when performing her/his daily activities.”

However, Siri’s limitations are substantial: it cannot securely process PHI or comply with HIPAA, rendering it unsuitable for clinical decision-making. Studies note its inconsistent accuracy for medical queries, such as providing USPSTF-aligned cancer screening guidelines in only 70% of cases, often relying on non-peer-reviewed sources like WebMD. 

Siri also lacks contextual awareness, struggling to interpret nuanced symptoms or follow-up questions. For instance, AI systems have erroneously advised discharging pneumonia patients with asthma because flawed training data understated those patients' risk.

 

How Siri is used by healthcare professionals 

Healthcare professionals have explored Siri primarily for caregiver support and administrative tasks. In the Frontiers in Public Health study mentioned above, 80% of caregivers reported improved efficiency when using a Siri-based app to log pediatric symptoms like fever or behavioral changes. 

Systematic reviews reveal experimental uses in chronic disease management, such as diabetes coaching and heart failure monitoring, though these interventions lack regulatory approval and clinical validation. A Healthcare (Basel) systematic review titled ‘Exploring the Role of Voice Assistants in Managing Noncommunicable Diseases: A Systematic Review on Clinical, Behavioral Outcomes, Quality of Life, and User Experiences’ notes, “While VAs [voice assistants] demonstrated good usability and moderate adherence, their clinical and quality-of-life outcomes were modest…VAs show potential as supportive tools in NCD [noncommunicable disease] management, especially for enhancing patient engagement and self-management.”

One trial included in the review showed that a voice assistant provided medication reminders and emotional support for diabetes patients, but speech recognition errors and accessibility barriers hindered consistent use.

 

Is Siri HIPAA compliant? 

Siri is not HIPAA compliant, as Apple does not provide BAAs or ensure end-to-end encryption for health data. Voice interactions involving PHI risk exposure during cloud processing, where third-party servers could intercept sensitive information. For example, a physician dictating patient notes via Siri might inadvertently transmit PHI to unsecured servers, violating HIPAA’s data protection requirements.

 

The role of FTC oversight 

According to a detailed analysis of FTC enforcement activities, the agency’s authority is grounded in consumer protection laws that prohibit deceptive advertising and unfair business practices. The FTC has actively pursued enforcement actions against mHealth app developers who make unsubstantiated health claims or engage in deceptive marketing, such as the cases involving dermatological apps MelApp and Mole Detective, where the FTC required substantiation of health claims through rigorous scientific evidence including randomized controlled trials.

The FTC’s standard for deception does not require proof of intent to deceive; rather, it focuses on whether a reasonable consumer is likely to be misled by a material representation or omission. This means that if a company, for example, markets Siri or a Siri-integrated app as HIPAA-compliant or clinically validated without meeting those standards, the FTC could intervene to stop such deceptive claims.

 

The privacy concerns that come with the use of Siri 

In 2019, it was revealed that contractors working for Apple had access to audio recordings from Siri, including highly sensitive and private conversations, as part of a quality control process known as grading. These recordings were sometimes triggered accidentally without users’ knowledge, leading to fears of surveillance and privacy violations. 

Although Apple responded by suspending the program, updating its privacy policies, and allowing users to opt out of data sharing, the incident highlighted the broader issue of how voice assistants handle and secure personal data. There are still concerns about whether Siri’s voice data is adequately protected against misuse or breaches, especially if accessed or stored improperly.

 

The safe applications of Siri in healthcare practices 

Despite its limitations, Siri can be safely employed in healthcare organizations for specific non-PHI workflows that support patient engagement and wellness promotion without exposing sensitive data. Voice assistants like Siri have been used to provide general health education and chronic disease management tips without referencing individual patient information. 

These uses align with the preventive care model, where AI tools support health promotion and self-management in a scalable manner. The literature suggests that these applications can enhance patient engagement and adherence to wellness goals, provided that the voice assistant is not used for clinical decision-making or processing of sensitive health data.

 

Why Siri cannot be used in healthcare

  • AI systems like Siri can inadvertently expose confidential health data, violating medical ethics and legal requirements such as HIPAA. Third-party contractors may also listen to recordings to improve natural language understanding, increasing privacy risks. (Journal of Medical Internet Research, 2021 Apr 9, 23(4))
  • When AI systems like Siri provide incorrect medical information, it is unclear who bears responsibility for adverse outcomes. This creates ethical and legal challenges in clinical settings, complicating liability and governance. (Pan African Medical Journal, 2022 Sept 2; 43)
  • Conversational assistants provide incorrect or incomplete medical information frequently (error rates between 8% and 86%), undermining trust and reliability for clinical use. Their responses often lack alignment with established medical guidelines. (Journal of Medical Internet Research, 2018 Sep;20(9))

Related: HIPAA Compliant Email: The Definitive Guide (2025 Update)

 

FAQs

What is the difference between health assistance and clinical healthcare delivery?

Health assistance generally refers to supportive, non-clinical services that help individuals maintain or improve their well-being, whereas clinical healthcare delivery involves direct medical care provided by trained health professionals to diagnose, treat, and manage diseases or injuries.

 

Are voice assistants compliant with healthcare privacy regulations like HIPAA?

Most consumer voice assistants are not fully compliant with HIPAA, the US regulation protecting patient health information. Some platforms, such as Amazon Alexa, have developed HIPAA-compliant skills for limited healthcare applications, but widespread adoption is hindered by privacy concerns.

 

Which types of wellness apps typically require HIPAA compliance?

Apps that collect biometric data, health metrics, or other personal health information and share it with healthcare providers, insurers, or health plans are subject to HIPAA.

 

Are all health and wellness apps required to be HIPAA compliant?

No. HIPAA does not generally apply to standalone health and wellness apps that do not handle PHI on behalf of covered entities. For example, popular fitness trackers like Fitbit or MyFitnessPal typically do not fall under HIPAA because they operate independently and do not share data with healthcare providers.
