Health care data is plentiful and immensely valuable. In 2020, it made up an estimated 30% of the world’s annual data production, and with the growing adoption of wearables and other data-gathering tools, that share is rising. The uses of medical data have proved revolutionary, including algorithms that better predict disease and produce a more accurate picture of long-term outcomes. A 2013 McKinsey study estimated that smarter data-driven health care could save $300 billion in spending per year.

But health care data is also deeply personal. While some might agree to let their medical records or specimen data be used to further science, those who do not—or who are never given the option—might be aghast to see it used without consent to drive a company’s bottom line.

One proposed solution has come out of the European Union. While the United States has a sprawling patchwork of state and local laws around privacy, the EU has, for years, been passing sweeping and centralized legislation that emphasizes personal control. To unlock some of that data for research use, it has introduced the concept of “data altruism.”

Data altruism could work a little like ticking an organ donor box on a driver’s license. Patients agree to share their health data, selflessly and without compensation, but only for noncommercial purposes aimed at the greater good. The idea factors heavily into new legislation, the Data Governance Act, published by the European Commission in May 2022. Its architects hope that “data altruists” will increase the flow of data, including health records, to researchers.

Organizations that want to use the data have to be approved by the relevant oversight bodies. They must agree to use the data only for delineated scientific purposes pursued in the public interest. Once altruists opt in, their health care data will flow into a central “pool,” per the DGA, where registered organizations can access it.

“Basically, it is supposed to help make data sharing—including for altruistic purposes—transparent,” says Mahsa Shabani, an attorney and data protection researcher at the University of Ghent in Belgium who studies how EU legislation may affect data sharing. The program also appears popular with the public: A survey published in 2017 of nearly 800 patients in Germany found that 87% were willing to share data for the right reasons.

Some critics have noted that the act could complicate the consent process and, ironically, add extra barriers to nonprofit research data use. In the short term, the European Data Protection Board and the European Data Protection Supervisor have warned about potential inconsistencies between the DGA and earlier privacy legislation, and during their review called for several changes, including a clearer definition of “purposes of general interest.” All of this may stall progress.

But more broadly, adding another layer of consent to the patient experience might deter potential altruists rather than free up their data, says Shabani. Existing EU law already imposes a strict consent process, and the DGA does not simplify it. “The regulatory framework is already complex,” she says. “And this is yet another piece of the puzzle.”

And any nudge to citizens or organizations to become data altruists must also be accompanied by strong assurances that those data will be properly protected, says Kristin Kostick-Quenet, a bioethicist and medical anthropologist at the Baylor College of Medicine in Houston. She notes, for instance, that some of the strongest privacy-preserving technologies available to date, including encryption and decentralized learning approaches, are not widely employed by major entities involved in data exchange—a concern she would like to see addressed in future efforts.

Shabani says that until the EU regulation goes into effect, it’s hard to predict whether the idea will pay off. “We always talk about the importance of trust in the governance system, and for this system to be successful it needs to give the impression that the system has transparency. Citizens need to be able to feel that,” she says. Whether or not the DGA will succeed in building that trust remains to be seen. “People have to know their data will not be used in a way that will harm them.”