On January 7, 2026, OpenAI announced the introduction of ChatGPT Health, a new section within ChatGPT focused specifically on health and wellness interactions.
The company described the feature as a way for users to organize health-related conversations separately from other chats and, if they choose, connect medical records and wellness apps such as Apple Health and MyFitnessPal to provide additional context for responses.
According to OpenAI, the system is intended to help users understand information, prepare for medical appointments, and track general patterns in their health, while not serving as a tool for diagnosis or treatment.
OpenAI also stated that physicians were consulted during development and that the system is evaluated using internal clinical-quality benchmarks. Access to ChatGPT Health is being introduced gradually through a waitlist, with initial availability limited by region and account type.
See also: How ChatGPT can support HIPAA compliant healthcare communication
ChatGPT Health, as a consumer-oriented product, is not inherently HIPAA compliant for handling protected health information (PHI). Section 5.4 of OpenAI's service agreement states that customers may not use OpenAI services to create, receive, maintain, transmit, or otherwise process PHI unless they have signed a Healthcare Addendum.
That addendum includes the Business Associate Agreement (BAA) required under HIPAA. Without it, processing PHI through any service not specifically designed or contracted for that purpose is prohibited.
The service agreement expressly states, “Customer agrees not to use the Services to create, receive, maintain, transmit, or otherwise process Protected Health Information, unless it has signed the Healthcare Addendum.”
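For organizations that nonetheless allow staff to experiment with general-purpose AI tools, one basic technical safeguard is to screen text for obvious identifiers before it leaves a controlled environment. The sketch below is a hypothetical illustration, not a compliance control: the redact_phi helper and its patterns are assumptions made for this example, regex matching catches only a handful of identifier formats, and no client-side redaction substitutes for a signed BAA or Healthcare Addendum.

```python
import re

# Hypothetical patterns for a few obvious identifier formats. Real HIPAA
# de-identification (the Safe Harbor method) covers 18 identifier
# categories and cannot be reduced to a short regex list.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "mrn": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def redact_phi(text: str) -> tuple[str, bool]:
    """Replace matched identifiers with placeholders.

    Returns the redacted text and a flag indicating whether anything
    matched, so a caller can block the request entirely rather than
    forward partially redacted content.
    """
    found = False
    for label, pattern in PATTERNS.items():
        text, count = pattern.subn(f"[{label.upper()} REDACTED]", text)
        found = found or count > 0
    return text, found

if __name__ == "__main__":
    note = "Follow-up 03/14/2026, MRN: 00482913, call 555-123-4567."
    redacted, had_phi = redact_phi(note)
    print(redacted)
    # Conservative policy: if anything matched, block the outbound request
    # to any service not covered by a signed BAA or Healthcare Addendum.
    print("Block outbound request" if had_phi else "OK to send")
```

A conservative policy treats any match as grounds to block the request rather than forward partially redacted text, since the contractual restriction applies to processing PHI at all, not only to unredacted PHI.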
Andrew Crawford of the Center for Democracy and Technology noted in a statement to the BBC, “Since it's up to each company to set the rules for how health data is collected, used, shared, and stored, inadequate data protections and policies can put sensitive health information in real danger.”
AI systems are not automatically HIPAA compliant.
Sharing health information with a consumer AI tool becomes a risk when sensitive data is stored, reused, or learned from without the safeguards, consent, and legal protections required under HIPAA.
An AI vendor can handle PHI only if it is willing to sign a BAA and demonstrates compliance with HIPAA's Rules.