HIPAA, enacted in 1996, was designed to protect the privacy and security of certain health information. But the law does not apply to all health data; it applies only to covered entities and their business associates. Covered entities are healthcare providers, health plans, and healthcare clearinghouses. Business associates are the third-party vendors that handle protected health information on their behalf.
Companies like Apple, Google, and Fitbit's parent Alphabet are not healthcare providers; they are technology companies. Unless they operate under a specific business associate agreement with a covered entity, the health data they collect falls entirely outside HIPAA's jurisdiction, which is why fitness trackers are generally not HIPAA compliant.
This is examined in detail in "Wearing Down HIPAA: How Wearable Technologies Erode Privacy Protections," a law review article published in the Journal of Corporation Law. Author John T. Katuska observes that none of the major wearable device manufacturers qualify as covered entities under the current framework, because they are not health plans, healthcare clearinghouses, or healthcare providers. They are technology companies that happen to collect clinical health data, and the law does not reach them.
Katie Bunch, an Associate Member of the University of Cincinnati Law Review, states that identical health information may be fully protected when it sits inside a hospital system, yet entirely unprotected when the same information is collected by a smartwatch or mobile application.
The "consumer app" loophole
Legal scholar Anna Mizzi, writing in "Profiting on Your Pulse: Modernizing HIPAA to Regulate Companies' Use of Patient-Consumer Health Information", identifies a category of companies at the center of this problem: consumer health interactive analysis companies, or CHIACs. Their model involves receiving raw health data from users, analyzing it, and returning insights in a continuous cycle. Mizzi argues that this ongoing, intimate relationship between company and user generates clinically significant health profiles, yet these companies operate almost entirely outside HIPAA's reach.
The Katuska law review article reinforces this point by explaining what it means to be a business associate. Whether a company qualifies depends entirely on its specific relationships with covered entities. If a physician contracts directly with a wearable device company to receive patient data, the company becomes a business associate and HIPAA applies. But if a physician simply recommends a Fitbit to a patient and the patient buys one independently, Fitbit is acting on behalf of the consumer and HIPAA does not apply. As the law review article puts it, the company would not "create, receive, maintain, or transmit protected health information on behalf of a covered entity," and therefore falls outside the law.
The Federal Trade Commission has stepped into some of this space. As the FTC notes in its compliance guidance on the Health Breach Notification Rule, "many companies that collect people's health information - whether it's a fitness tracker, a diet app, a connected blood pressure cuff, or something else - aren't covered by HIPAA. Does that mean this sensitive health information doesn't have any legal protections? Not at all." The FTC enforces the Health Breach Notification Rule, which requires certain health app companies to notify users if their personal health records are disclosed without authorization. In 2021, the FTC clarified that the rule applies to health apps and connected devices that collect consumer health data.
The FTC's own guidance shows how broadly the rule applies: "if you develop a health app that collects information from consumers and can sync with a consumer's fitness tracker, you're probably a vendor of personal health records." That category triggers notification obligations, but not the deeper privacy and data-use protections that HIPAA provides.
In practice, this means your fitness tracker company may be within its legal rights to share your anonymized health data with insurers, employers, marketers, and researchers. The legality is based on the terms you agreed to when you clicked "I Accept" during setup.
The FDA gap
In Smart Wearables: The Overlooked and Underrated Essential Worker, Rebekah Hill argues that wearable devices are already functioning as medical devices in all but name, yet manufacturers sidestep oversight using strategic marketing language.
Under the FDA's existing definition, Section 201(h)(1)(B), a medical device is one "intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease, in man." The operative word, Hill argues, is "intended," and manufacturers have exploited it. By labeling devices as tools for "general wellness" rather than diagnosis or treatment, companies can avoid the FDA approval process, even when their products perform functions that would otherwise trigger regulation. Hill analyzes the Apple Watch: the watch's ECG feature obtained FDA clearance as a Class II medical device because Apple promoted it as detecting atrial fibrillation. However, the watch's pulse oximeter feature, which measures blood oxygen saturation on demand just as a clinical pulse oximeter does, did not receive such clearance, because Apple chose to market it as a wellness tool rather than a diagnostic one.
Hill proposes replacing the "intended use" standard with an objective "capable of" test, under which a device would be regulated based on what it can actually do, not what its manufacturer claims it is for. This single change, she argues, would close the loophole that currently allows companies to self-exempt from oversight by adding a wellness disclaimer to otherwise clinical technology.
Hill also cites the role smart wearables played during the COVID-19 pandemic as further evidence that these devices already function as medical instruments. Wearable-based heart rate, step, and sleep data were used in research to detect presymptomatic COVID-19 cases.
What the data is actually worth and who's buying it
The Katuska law review article notes the danger: health information that would be protected as electronic protected health information (ePHI) if held by a hospital or insurer has no equivalent protection when held by a consumer wearable company. Without the administrative, physical, and technical safeguards required by HIPAA's Security Rule, that data is more vulnerable to theft, sale, or exploitation. Katuska cites the Department of Health and Human Services' own framing of what can go wrong when health information lacks protection: it can be passed on to lenders who deny credit applications, or to employers who use it in personnel decisions.
According to Mizzi, many of these companies sell user data to third-party aggregators, often without users' knowledge; the aggregators then compile detailed health profiles that can be resold across industries, including insurance and employment. One data aggregator she cites describes its own role as collecting, connecting, compiling, and selling data every minute of every day. A 2023 study found that wearable-generated data could be used to infer users' chronic conditions with meaningful accuracy, even when only indirect biometric signals were analyzed.
A 2014 FTC study, cited by Bunch, found that 12 mobile health applications and devices transmitted user health information to 76 separate third parties, including data that could be traced back to specific individuals. Eighteen of those third parties received device-specific identifiers, and 22 had access to additional health information.
Mizzi also highlights the risk to individual safety. Without HIPAA's encryption and disclosure standards applying to these companies, sensitive health data is more vulnerable to breaches, black market sale, and exploitation. One case she documents involved a woman who nearly lost custody of her children after someone stole her medical identity and used it to check into a hospital.
The stakes in data breach cases are well documented in the courts. Bunch points to Clemens v. ExecuPharm Inc., in which a court recognized that the unauthorized exposure of personal health information created a risk of identity theft and fraud, with plaintiffs suffering emotional distress, therapy costs, and time and money spent mitigating the damage. Other cases she cites show plaintiffs raising concerns about misuse of their health data, violations of privacy, and the diminished value of their personal information once exposed.
The workplace dimension
Mizzi draws attention to corporate wellness programs that incentivize or require employees to wear fitness trackers and log health data. Without clear HIPAA protections applying to these arrangements, she argues, the line between voluntary participation and coercion can blur, especially when financial incentives are substantial or when participation data is displayed publicly, such as on workplace leaderboards. Data aggregators can sell health profiles to employers or prospective employers just as they do to insurers, and existing laws like the Americans with Disabilities Act were not designed to address this specific risk.
Bunch raises similar concerns, noting that the current legal framework leaves open the possibility that health data collected through workplace wellness programs could flow to employers or insurers without consumers' meaningful knowledge or consent.
Hill's analysis in Smart Wearables: The Overlooked and Underrated Essential Worker adds to this concern. She points to the SMARTWATCH Data Act, a bipartisan Senate bill, as evidence of growing legislative recognition that wearable devices collect personal health information equivalent in sensitivity to what HIPAA protects in clinical settings. The bill's drafters framed it as extending existing healthcare privacy protections to data collected by wearables.
The case for reform
Mizzi argues that the most practical path forward is for the Department of Health and Human Services to reinterpret HIPAA's existing definition of "covered entities" to include companies like Fitbit and Flo Health. Because HHS has historically interpreted the definition of "health care provider" broadly, and has explicitly left room to expand that interpretation in response to technological change, she contends this would not require new legislation, only updated regulatory guidance.
Bunch approaches the same problem from a different angle, proposing that HIPAA could be revised to protect health information based on the nature of the data itself, rather than the identity of whoever collects it. Under such a framework, coverage would depend on whether data reveals information about a person's physical or mental health, medical conditions, treatments, or biological metrics, not on whether the collecting entity qualifies as a traditional healthcare provider. She argues the law could distinguish between varying levels of sensitivity, imposing stricter requirements on biometric or especially identifying data.
The Katuska law review article offers a third path, focused on clarity and practicality. Rather than reinterpreting existing definitions or restructuring protections around data type, Katuska proposes adding a fourth category to the definition of covered entities: companies that produce devices whose primary purpose is achieved through collecting health information from individuals. This "primary purpose" test is already well established in other areas of law, he argues, making it workable for courts and regulators to apply. Under this framework, a Fitbit or Apple Watch would trigger HIPAA coverage not because of its relationship to a healthcare provider, but because collecting health data is central to what the device does. Katuska argues this approach would benefit all parties: giving companies clear compliance guidance, giving consumers enforceable protections, and closing the gap between what the law was designed to achieve and what it actually covers.
Hill, writing from the FDA angle, offers a fourth path that complements the HIPAA reform proposals. Her proposed amendment to the FDA's "medical device" definition, substituting "capable of" for "intended for use in", would make it difficult for a manufacturer to self-exempt a clinically capable device simply by relabeling it as a wellness product. Importantly, Hill notes that this change would benefit manufacturers as well as consumers, since clear regulatory guidance removes the current ambiguity that forces companies to guess whether their device requires FDA oversight.
Beyond scope, Bunch calls for specific requirements governing how wearable health data is collected, stored, and shared. This includes data security mandates, regular risk assessments, and limits on data retention to what is functionally necessary. She also argues for stronger transparency obligations, requiring companies to provide clear, detailed notices about what data is collected, how it will be used, and which third parties may have access.
The consent problem
One area where Bunch's analysis adds weight to the reform conversation is consumer consent. She argues that vague, buried consent language is not sufficient when the data at issue includes heart rate patterns, sleep cycles, and other intimate biometric information. Consent, she contends, should be specific, segmented, and renewed, allowing users to choose which categories of data they are willing to share and for which purposes, with companies required to obtain fresh consent whenever new data types are introduced.
She also advocates for accessible dashboards that give users ongoing visibility into and control over their data-sharing arrangements, including the ability to revoke consent at any time.
The Katuska law review article adds to this concern from the company side. As long as it is unclear whether HIPAA applies to a given wearable manufacturer, companies face a choice between spending capital to implement HIPAA-compliant procedures they may not legally need and forgoing those procedures, risking fines if regulators later determine that they were required. That ambiguity, the law review article argues, discourages investment and innovation while leaving consumers no better protected.
Emerging regulations
Several US states, including Washington, California, and Connecticut, have enacted or proposed health data privacy laws that extend protections beyond HIPAA's covered entity framework. Washington's My Health My Data Act, which took effect in 2024, is among the broadest, applying to any company that handles health data about Washington residents, regardless of whether the company is a HIPAA covered entity.
Bunch points to these state frameworks as concrete models worth building on. Washington's statute focuses on transparency and consent. Connecticut limits how sensitive information can be handled. California's Consumer Privacy Act grants individuals explicit rights to know what is being collected, to delete it, to opt out of its sale, and to avoid discrimination for exercising those rights. Bunch argues that a comprehensive federal framework should incorporate and expand all three approaches, combining transparency requirements, limits on data use, and strong consumer rights into a uniform national standard.
At the federal level, the FTC has been updating its enforcement posture to keep pace. Its July 2024 amendments to the Health Breach Notification Rule made explicit what was previously implied. As the FTC put it in its own compliance guidance, "Since the FTC first issued the Rule more than a decade ago, consumers have turned to apps, wearables, and other technologies for health advice, information, and tracking. It is imperative that the FTC's enforcement of its Rule keep pace with changing technology." Companies that fall short face consequences: the FTC's guidance confirms that violations can result in civil penalties of up to $53,088 per violation.
FAQs
Can I opt out of data sharing after I've already agreed?
Most companies allow you to delete your account, but there is no federal law guaranteeing that your previously shared data will be recalled or destroyed.
Is my data safer with a smaller, lesser-known fitness app?
Smaller companies may actually pose greater risks, as they often have fewer security resources.
What should I do if I suspect my health data has been shared without my consent?
You can file a complaint with the FTC, but your legal options remain limited unless you live in a state with specific health data privacy protections.