
As a health policy wonk and health economist who has worked in pharmaceutical companies in the United States, Latin America, and Europe, I’ve seen vast volumes of data generated, gathered, aggregated, analyzed, shared, and resold by health care companies and organizations. In my studies with the world’s top medical statistics experts at the University of Oxford’s Centre for Evidence-Based Medicine, I’ve also seen how flawed many datasets are, missing critical data elements and definitions, yet still being used by the industry to make key decisions.

I’m now more certain than ever that patients are being seriously exploited in terms of their data, its value, and the profitability others are deriving from its aggregation and sale — though some are beginning to realize just how valuable their health data can be. They should be able to bank that value.


One encouraging sign has been the 21st Century Cures Act, which gives Americans the right to free, digital access to their health records, and further regulates how health data are used. Another came recently when the Federal Trade Commission — finally — enforced the Health Breach Notification Rule for the first time since it became law in 2009. The FTC ruled that GoodRx, a popular online destination for finding the best price for prescriptions, broke promises to users about not sharing their data with Facebook, Google, and other third parties for advertising purposes, monetized its users’ data without their consent, and committed other violations of the Health Breach Notification Rule. The company agreed to pay a $1.5 million civil penalty.

Putting teeth into the Health Breach Notification Rule sends a warning signal to companies that aren’t being honest about the use of valuable health data. The rule requires businesses not covered by the better-known (but often misunderstood) Health Insurance Portability and Accountability Act of 1996 (HIPAA) to notify customers if there is a breach of individually identifiable electronic health information.

The GoodRx story confirms what industry insiders have known for years: sharing health data is rampant among health care providers, the financial and IT companies that support them, and the tech companies that are increasingly blurring the line between health, consumer, and advertising data. In many cases, such data sharing is not only permitted by existing regulations, including HIPAA, but actively facilitated by them, helping businesses share data to adjudicate insurance claims and make payments, for example.


The U.S. approach to data sharing stands in stark contrast with standards now in place in other parts of the world, including Europe’s General Data Protection Regulation (GDPR). More importantly, it goes against what people say they want, which is to freely share their data for scientific or research purposes that help others, but also to share in the upside when their data are highly valuable, such as when they help lead to the discovery of a multibillion-dollar drug.

Patient data as the new oil

It’s been more than 15 years since the first declaration that “data is the new oil.” But policymakers are still trying to figure out the right set of rules to make sure this “new oil” can be handled fairly.

The better the properties and potential of digital data are understood, the more it might behave like something even more valuable than oil. Data are starting to look more and more like a form of money itself — a way to exchange and transact, a way to represent work completed, and a medium that can be traded, sold, or shared. Data may be thousands of times more valuable than oil, particularly when they are highly specific and personalized, and can be leveraged by multibillion-dollar industries.

But unlike money, no global infrastructure exists to govern, manage, and monitor data transactions, and no collective institutions exist to protect data security or integrity. Health care data in particular exist in a Wild West environment governed by permissive regulations that enable data sharing and transfer among entrenched interests rather than protecting individuals’ privacy.

Today’s system allows data monopolies to operate unfettered and reap profits while data laborers — in this case, patients and health care professionals — receive no compensation for their essential contributions. This status quo is unacceptable, antiquated, and exploitative. Yet the health care landscape is dotted with companies whose only value proposition is aggregating and reselling data created by patients and doctors.

Roche bought Flatiron, for example, for $1.9 billion in 2018 based on the projected future success of this business model. Cerner, Epic, McKesson, and UnitedHealth/Optum, as well as smaller companies like IQVIA, Komodo, and Symphony Health, do the same. These companies, furthermore, mix and match datasets to glean more information — health and financial information for the same anonymized patient across numerous providers — in ways that don’t violate HIPAA but fall far short of truly protecting privacy. Re-identification in these contexts is a larger risk than most individuals, or companies, may realize.

What to do

Two things need to happen to protect patients’ privacy and give them an economic stake in their health data.

First, patients, health care professionals, and consumers in general need to know and understand more about how health companies and others are sharing and using their data.

Second, the health data economy must become more equitable. When Flatiron was bought for $1.9 billion, how much of that value had been generated by the cancer patients whose data were Flatiron’s business? A lot — yet they received nothing from the transaction. They may even have received negative value, with some struggling to pay for their cancer treatments while Flatiron sold their data to Roche, one of the world’s largest makers of cancer treatments.

The situation recalls the case of Henrietta Lacks, the Black woman whose cancer cells became a cornerstone for research in the early 1950s but who received no compensation for her contribution to medical science, only posthumous recognition.

As the Health Breach Notification Rule, the 21st Century Cures Act, and apps like Epic’s MyChart draw attention to how much data individuals produce, people will demand accountability, transparency, and, eventually, compensation. Once people figure out their information is being used for profit, they will want a share. People are happy to share data if they think it provides health benefits to themselves or others. Their altruism stops, though, when they feel that companies are exploiting them, or profiting from data sales on one hand while sending them hefty bills on the other.

Banking data

One solution worth considering follows from thinking about data and its value the way we think about money and banking. Banks keep their customers’ money safe and pay interest on it, and they find other productive uses for it while it is in their care. People can also move their money to another bank if they’re unhappy with the service they’re getting or worried about a bank’s security track record.

We need the same institutions and functionality for health data.

New technologies make this vision possible, helping ensure ethical and responsible data utilization as well as distributing profits not only to data brokers and curators but also to data generators. A lot can be learned from financial institutions on how to implement this model; the framework is there, and the need is stronger than ever.

Jennifer Hinkel is the managing director and chief growth officer of the Data Economics Company, a Los Angeles-based company developing the economic and technology frameworks for private data vaults that people and companies use to package and manage their data as digital assets.

