Tags: health data, privacy, data sovereignty, AI, decentralization

In Plain Sight: How Every Major Tech Company Quietly Decided Your Health Data Was Theirs

The consolidation of personal health data didn't require a hack or a breach. It happened in documents you could access, in press releases you didn't read, in earnings calls you weren't watching.

Amach Health · April 30, 2025 · 8 min read

Billions of health records have been stolen through data breaches in the last decade. Hospital systems, insurance companies, genomics labs. Names, diagnoses, social security numbers, genetic profiles. The breaches made headlines. Congress held hearings. Settlements were paid.

But the breach isn't the point.

The more consequential transfer of your health data didn't require a hack. It didn't require a crime. It happened in documents you technically had access to, announced in press releases you didn't read, buried in earnings calls you weren't watching. The consolidation of personal health data into the hands of a small number of powerful entities is one of the most significant shifts in the history of medicine. And it happened in plain sight.

The Pattern Is Not a Coincidence

Google completed its acquisition of Fitbit in 2021, adding over 120 million registered users' health profiles to its existing data infrastructure. Microsoft closed its acquisition of Nuance in 2022, deepening its clinical data partnerships through tools that process hundreds of millions of patient records annually. Apple quietly expanded HealthKit's data-sharing capabilities while keeping the underlying data within its own ecosystem. And Amazon's 2023 acquisition of One Medical gave it access to physical clinical records for the first time.

None of these moves were hidden. All of them were framed as improvements to your experience.

What they share is a direction. Data that once lived in isolated pockets — your doctor's office, your wearable, your phone — is flowing toward centralization. Not because of malice, but because of economics. Health data is extraordinarily valuable for training AI models, for predicting consumer behavior, for building products that anticipate needs before you name them. The entity with the most data builds the best models. The best models attract more users. More users generate more data. The cycle is self-reinforcing, and it has a name in physics: gravity.

Data, like matter, flows toward mass.

What the Policies Actually Say

Most people assume their health data is protected. HIPAA is often cited as the reason. But HIPAA governs covered entities — primarily hospitals and insurers — and it has significant gaps when it comes to consumer wellness platforms, AI companies, and the growing category of health tech that sits outside traditional medical infrastructure.

Consider what the policies of leading health platforms actually contain.

One prominent longevity platform's terms permit sharing anonymized data with third-party research partners, using aggregated data for product development, and training internal AI models on user health metrics. The opt-out mechanisms exist but require active navigation through settings most users never open.

A second platform uses the phrase "appropriate authorization" when describing when AI training on your data is permissible. The phrase is undefined. That ambiguity is not an oversight.

These aren't bad companies. They're operating rationally within a system that treats your health data as a resource to be refined rather than a possession to be protected. The extraction isn't hidden in the policies; it's just written in language calibrated not to alarm.

The AI Layer Changes the Stakes

The arrival of large language models trained on health data introduces a dimension that didn't exist five years ago. It's no longer just about knowing your resting heart rate or your sleep patterns. It's about what can be inferred from the combination of all of it.

AI models trained on population-scale health data can predict disease onset, insurance risk, fertility windows, cognitive decline trajectories, and psychological states with accuracy that would have seemed implausible a decade ago. The individual data point is relatively uninteresting. The pattern across millions of data points, correlated against outcomes, is extraordinarily powerful.

The companies building these models need training data. Consumer health platforms are one of the richest sources of longitudinal, behavioral, biometric data that exists. The incentive to acquire it, license it, or structure terms of service to permit its use is enormous. Several major AI labs have announced health data initiatives in the past eighteen months. The direction is not ambiguous.

What you generate becomes what trains the system. What trains the system serves the platform. The individual user is simultaneously the product's customer and its raw material.

Hiding in Plain Sight

The remarkable thing about this consolidation is how visible it has always been.

The acquisitions were reported by every major financial outlet. The policy language is publicly accessible. The AI training initiatives are announced in press releases. The business logic is straightforward enough that any analyst could trace it.

What makes it effectively invisible is not concealment. It's friction. Understanding what these policies mean requires reading documents designed not to be read. Connecting the dots across multiple companies and sectors requires time most people don't spend. And the framing — health improvement, personalized medicine, AI-powered wellness — is genuinely appealing. People hand over their data willingly because the value exchange feels reasonable in the moment.

The moment of exchange is the only moment most people ever think about it.

What Sovereignty Actually Means

Sovereignty over health data isn't a privacy preference. It's an architectural position.

The current architecture moves data from you to platforms. The platform stores it, analyzes it, trains on it, and decides what insights to return to you. You are a data source in someone else's infrastructure.

A sovereign architecture moves differently. Your data is generated by you, stored by you, and analyzed on your behalf. Insights come back to you without the underlying data ever leaving your control. Verification that you meet a health threshold — for insurance, for access, for research participation — can be proven mathematically without revealing the data behind the proof.
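The data flow described above can be sketched in miniature. This is a toy illustration, not Amach's implementation: the function names (`attest_threshold`, `verify`) and the shared device key are hypothetical, and a real system would use a zero-knowledge proof rather than a shared-key MAC, which requires the verifier to hold the same key. What the sketch shows is the shape of the flow: raw readings never leave the device; only a derived claim and an integrity tag do.

```python
import hashlib
import hmac
import json

# Illustrative only: in practice this key would be provisioned securely,
# and the attestation would be a zero-knowledge proof, not an HMAC.
DEVICE_KEY = b"secret-device-key"

def attest_threshold(readings: list[int], threshold: int) -> dict:
    """Analyze locally; emit only the claim plus an integrity tag."""
    claim = {"metric": "resting_hr", "below": threshold,
             "result": max(readings) < threshold}
    tag = hmac.new(DEVICE_KEY, json.dumps(claim, sort_keys=True).encode(),
                   hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}  # the readings stay on-device

def verify(attestation: dict) -> bool:
    """Verifier checks the tag without ever seeing the raw readings."""
    expected = hmac.new(DEVICE_KEY,
                        json.dumps(attestation["claim"], sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["tag"])

att = attest_threshold([58, 61, 63], threshold=70)
print(att["claim"]["result"], verify(att))  # True True
```

The design point is the boundary: `attest_threshold` runs on the user's hardware, and the only artifact that crosses the network is a claim a third party can check.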

This isn't speculative technology. Zero-knowledge proofs, the cryptographic mechanism that makes this possible, are mature enough to run on mobile hardware. Decentralized storage that encrypts data such that not even the storage provider can read it exists today. The infrastructure for genuine health data sovereignty is not a future promise. It's an engineering choice.
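The storage property, ciphertext the provider cannot read, can also be shown in miniature. The sketch below uses a one-time pad purely because it fits in a few stdlib-only lines; it is not production cryptography (real deployments would use an audited AEAD cipher such as AES-GCM via a vetted library), and the record contents are invented. The point is where the key lives: with the user, never with the provider.

```python
import secrets

def xor_pad(data: bytes, key: bytes) -> bytes:
    """Toy one-time pad: key must be as long as the data and used once.
    XOR is its own inverse, so the same function encrypts and decrypts."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

record = b'{"resting_hr": 61, "sleep_hours": 7.4}'   # hypothetical record
key = secrets.token_bytes(len(record))               # stays on the user's device
ciphertext = xor_pad(record, key)                    # all the provider ever stores

assert ciphertext != record                 # provider sees only noise
assert xor_pad(ciphertext, key) == record   # user alone can recover the data
```

Everything the storage provider holds is indistinguishable from random bytes; without the key, "not even the storage provider can read it" is a mathematical fact, not a policy promise.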

Why It Matters Now

The window for individual health data sovereignty is narrowing, not because the technology is becoming less accessible, but because the consolidation is compounding. Every year that passes, more data flows into centralized systems. The models trained on that data become more accurate. The competitive advantage of having that data grows. The structural incentive to maintain the current architecture strengthens.

This is not a call to alarm. It's an observation about trajectory.

The current path leads to a small number of entities holding extraordinarily detailed models of human health, built from data generated by people who had limited understanding of what they were contributing. The alternative path requires choosing, deliberately and architecturally, to keep that data with the people who generate it.

That choice has been made. The work of building for it continues.


Amach is a decentralized health data platform built on the principle that your health data belongs to you. Not as a policy, as an architecture.

Ready to take ownership of your health data?
