When does AI become a business associate under HIPAA?

Under HIPAA, a business associate is defined in 45 CFR 160.103(1)(i) as a person or entity that, on behalf of a covered entity, "creates, receives, maintains, or transmits protected health information for a function or activity regulated by this subchapter."

The criterion for determining whether an AI system is a business associate is whether the system or its provider creates, receives, maintains, or transmits PHI on behalf of a covered entity. If the answer is yes, a business associate agreement (BAA) is required. However, AI technology makes this determination trickier than it is with traditional vendors.

As noted in Regulatory Aspects of Artificial Intelligence and Machine Learning, "business associates dealing with AI solutions and AI systems handling PHI must comply with the same privacy and security standards as human operators." In other words, AI technology does not create exceptions to HIPAA's requirements; they apply equally whether data is processed by human staff or by AI algorithms.

Read also: What is the purpose of a business associate agreement?

 

When AI requires a BAA

The regulation at 45 CFR 160.103(1)(i) specifically lists business associate functions including "claims processing or administration, data analysis, processing or administration, utilization review, quality assurance, patient safety activities... billing, benefit management, practice management, and repricing." Many AI applications fall within these categories.

For example, if you're using an AI-powered transcription service to convert patient-clinician conversations into medical records, that vendor is handling PHI and needs a BAA. Similarly, AI diagnostic tools that analyze patient imaging or lab results, AI chatbots that interact with patients about their health conditions, or AI platforms that process claims data all involve direct PHI access.

According to Regulatory Aspects of Artificial Intelligence and Machine Learning, when patient data is pushed to the cloud for AI algorithm analysis, the process must ensure that PHI is protected through measures such as encryption, access control, and de-identification. This requirement applies regardless of where the cloud servers are located or how the AI processing is structured.
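To make the encryption point concrete, here is a minimal Python sketch that encrypts a record before it leaves the organization. It assumes the open-source cryptography package is installed; the record contents and the send_to_cloud upload step are hypothetical placeholders, not any vendor's actual API.

import json
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key would live in a managed key store with its own access
# controls, not be generated inline like this.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"patient_id": "12345", "note": "Follow-up on hypertension medication."}
ciphertext = cipher.encrypt(json.dumps(record).encode("utf-8"))

# Only the ciphertext is transmitted; without the key, the recipient cannot read it.
# send_to_cloud(ciphertext)  # hypothetical upload call

plaintext = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))

Keep in mind that encryption is a safeguard, not a substitute: if the AI vendor can decrypt and process the PHI, a BAA is still required, a point the FAQ below repeats.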

As Delaram Rezaeikhonakdar notes in AI Chatbots and Challenges of HIPAA Compliance for AI Developers and Vendors, when hospitals or physicians input patient health data into AI chat tools for purposes such as responding to medical questions, generating patient letters, creating medical summaries, or producing clinical and discharge notes, those AI vendors become business associates under HIPAA and must comply with its requirements.

45 CFR 160.103(1)(ii) also covers entities that provide "legal, actuarial, accounting, consulting, data aggregation... management, administrative, accreditation, or financial services" where "the provision of the service involves the disclosure of protected health information." AI tools providing these services to covered entities would require BAAs.

The main factor is whether the AI system processes identifiable patient information as part of its core function. Under HIPAA, protected health information means individually identifiable health information that is transmitted by electronic media, maintained in electronic media, or transmitted or maintained in any other form or medium.

Related: Are AI assistants business associates?

 

The gray areas

Complications arise with AI tools that touch PHI indirectly or where the nature of data access is unclear. Consider, for instance, an AI system that analyzes aggregate, de-identified data to identify population health trends. If the data is truly de-identified according to HIPAA's standards, the AI vendor is not a business associate. Under the Safe Harbor method, however, all 18 HIPAA identifiers must be removed and the covered entity must have no actual knowledge that the remaining information could be used to identify an individual.
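As a rough illustration of what identifier removal looks like in practice, here is a minimal Python sketch. The field names are hypothetical and cover only a handful of the 18 Safe Harbor categories.

# Hypothetical field names; the Safe Harbor method covers 18 identifier
# categories (names, geographic units smaller than a state, dates, device
# identifiers, and so on), not just the ones listed here.
IDENTIFIER_FIELDS = {
    "name", "street_address", "phone", "email", "ssn",
    "mrn", "date_of_birth", "ip_address",
}

def strip_identifiers(record: dict) -> dict:
    """Return a copy of the record with the listed identifier fields removed."""
    return {key: value for key, value in record.items() if key not in IDENTIFIER_FIELDS}

record = {"name": "Jane Doe", "date_of_birth": "1980-04-02", "dx_code": "I10"}
print(strip_identifiers(record))  # {'dx_code': 'I10'}

Dropping fields is only part of the job: dates and small geographic areas typically need generalization, and, as the next paragraph explains, re-identification risk from combining datasets has to be weighed as well.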

Rezaeikhonakdar notes a particular concern about re-identification, especially when technology companies like Meta, Google, and Microsoft already hold large amounts of personal information and integrate AI into their services. This phenomenon, known as data triangulation, can compromise datasets that were de-identified using the Safe Harbor method. The risk grows when these companies integrate generative AI into their platforms or require users to rely on their services to access AI tools.

Another gray area involves AI tools embedded within existing software. If your electronic health record system incorporates AI features developed by a third party, you need to determine whether that AI developer has access to PHI or whether the AI operates within the EHR vendor's environment. 45 CFR 160.103(3)(iii) clarifies that "business associate includes: A subcontractor that creates, receives, maintains, or transmits protected health information on behalf of the business associate." This means that if the third-party AI developer is a subcontractor to your EHR vendor and handles PHI, they are also considered a business associate and require appropriate agreements.

Learn more: Can de-identified data be used to train AI under HIPAA?

 

The conduit exception

HIPAA regulations address entities that merely transmit data, but the application to AI systems requires careful analysis. According to 45 CFR 160.103(3)(i), a "business associate includes: A Health Information Organization, E-prescribing Gateway, or other person that provides data transmission services with respect to protected health information to a covered entity and that requires access on a routine basis to such protected health information."

This means data transmission services are business associates if they require routine access to PHI. Some AI vendors may state that they qualify as mere conduits, claiming they simply transmit data without accessing it. However, true conduits provide temporary transport without viewing, using, or retaining data. Most AI applications require accessing and processing the data to function. An AI model analyzing medical images must "see" those images, which means it requires routine access to PHI, disqualifying it from conduit status under the regulation.

 

Considerations for compliance

Healthcare organizations should adopt a systematic approach to evaluating AI vendors. Start by mapping data flows: what data goes to the AI system, in what form, and what happens to it? Request technical documentation about how the AI processes data, where it's stored, and who can access it.
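One lightweight way to keep that mapping consistent is to record the same questions for every vendor. The Python sketch below is purely illustrative; the field names and the example entry are assumptions, not a prescribed format.

from dataclasses import dataclass

@dataclass
class AIVendorDataFlow:
    vendor: str
    data_sent: list            # what data goes to the AI system
    form: str                  # identifiable PHI, limited data set, or de-identified
    storage_location: str      # where the vendor stores or processes it
    retention: str             # what happens to the data afterward
    baa_signed: bool = False

flows = [
    AIVendorDataFlow(
        vendor="ExampleTranscribe (hypothetical)",
        data_sent=["clinician-patient audio", "visit metadata"],
        form="identifiable PHI",
        storage_location="vendor cloud, US region",
        retention="deleted after 30 days per contract",
        baa_signed=True,
    ),
]

# Any entry that involves identifiable PHI but has no signed BAA should be
# escalated before the tool goes into use.
missing_baas = [f.vendor for f in flows if f.form == "identifiable PHI" and not f.baa_signed]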

Ask vendors directly whether they consider themselves business associates and whether they will sign a BAA. A vendor's refusal to sign a BAA isn't necessarily a red flag; the vendor might legitimately not need one if it never handles PHI.

Review your organization's risk assessment. Even if a vendor could technically avoid business associate status through the way it handles data, consider whether accepting that arrangement creates privacy risks. Rezaeikhonakdar states that AI developers and vendors should treat health data in a way that complies not just with the letter of HIPAA but with its spirit and purpose.

 

The consequences of getting it wrong

Failing to obtain a required BAA can result in HIPAA violations for both the covered entity and the vendor. The Office for Civil Rights can impose civil monetary penalties, and violations involving willful neglect carry mandatory minimum penalties. Beyond financial penalties, breaches involving AI vendors can damage patient trust and your organization's reputation.

A 2018 case shows the consequences of neglecting BAA requirements. Advanced Care Hospitalists PL (ACH), a Florida contractor physicians' group, paid $500,000 to settle HIPAA violations after sharing protected health information with a medical billing vendor without executing a business associate agreement. The breach exposed the names, dates of birth, and social security numbers of over 9,000 patients on a public website.

OCR's investigation revealed that ACH had operated since 2005 without conducting a risk analysis, implementing security measures, or adopting any policy requiring business associate agreements until 2014. Then-OCR Director Roger Severino stated, "This case is especially troubling because the practice allowed the names and social security numbers of thousands of its patients to be exposed on the internet after it failed to follow basic security requirements under HIPAA."

The case shows that BAA failures often compound other compliance gaps and can lead to financial penalties and reputational damage, concerns that remain relevant as healthcare organizations evaluate AI vendor relationships.

 

FAQs

Does HIPAA apply differently to generative AI than traditional machine-learning tools?

No, HIPAA applies based on whether PHI is created, received, maintained, or transmitted, not on the type of AI technology used.

 

Can an AI vendor be a business associate even if humans never view the data?

Yes, automated access to PHI by an AI system still constitutes access under HIPAA, even without human review.

 

Does encrypting PHI eliminate the need for a business associate agreement?

No, encryption is a safeguard but does not remove the requirement for a BAA if the vendor handles PHI.

 

What happens if an AI vendor subcontracts model hosting or inference to another company?

Any subcontractor that accesses PHI becomes a downstream business associate and must be covered by appropriate agreements.

 

Can a covered entity rely on an AI vendor’s claim that it is HIPAA compliant?

No, HIPAA compliance claims do not replace the legal requirement for a signed BAA where PHI is involved.

 
