The EU Should Not Exclude Private Firms From Voluntary Data-Sharing Frameworks for Social Good

by Hodan Omaar

The European Parliament will likely vote next week to approve the Data Governance Act (DGA), a plan to strengthen data-sharing mechanisms in the EU. In doing so, the EU will introduce a new concept of voluntary data sharing for social good: “data altruism.” The intention is good, but as defined, its impact will be limited because it excludes for-profit organizations.

Data altruism encompasses two types of voluntary data-sharing for social good. First, it includes individuals consenting to share their personally identifiable information (PII) for research, such as the Personal Genome Project, which enables willing participants to publicly share their genome sequence and health data for use in scientific research. Second, it covers nonprofit organizations sharing non-PII about their users for socially beneficial purposes. An example is Entur, a government-owned transportation company in Norway that collects and shares anonymized data about trips people take on all forms of Norwegian public transportation for secondary purposes.

While the DGA is a positive step, it is premised on the faulty notion that private companies have little to contribute to the public good, and as such excludes for-profit organizations that do not perform activities related to data altruism “through a legally independent structure, separate from other activities it has undertaken.” This exclusion is problematic because much of the most granular, representative, and useful data needed to address social challenges rests in the hands of for-profit companies and is collected primarily for commercial reasons.

Consider mobility data that describes people’s movements from one location to another. When aggregated and made available to researchers, this data can help elucidate and address many pressing social challenges that are fundamentally affected by patterns of human mobility and social interaction, such as disease spread, urban development, displacement, climate change, and disaster response. Population censuses and travel history surveys have historically served as sources of mobility data for researchers exploring population movements, but researchers have increasingly turned toward location data from private sector call records and mobile apps that provide location-based services to better understand human mobility patterns at high spatial resolutions, spanning wide temporal periods, and across international borders. For example, a range of researchers used granular mobility data from Facebook’s Data for Good Program during the COVID-19 pandemic for research including assessing the risk of the virus spreading out of Wuhan, China, investigating the impact of social connectedness on the spread of the virus, and informing policies that could bolster countries’ post-pandemic economic future. 

Currently, however, few private firms voluntarily share the data they hold for secondary use because they face a number of challenges that limit their willingness and ability to share. These include privacy and security concerns associated with sharing sensitive data; the financial costs of de-identifying data; the economic opportunity costs that may arise if competitors use their proprietary data for their own gain or the data is used in ways that harm their reputation; and, most importantly, the lack of regulatory clarity on what types of data to share, with whom, and for what purposes.

It is unclear why the DGA, which includes other provisions to support the unlocking of private sector data, excludes for-profit companies from its definition and framework for data altruism organizations. But it may stem from the Friedmanite view that the goals of profit-maximizing companies are at odds with altruism, since the latter commonly involves the giving of tangible gifts, money, or services at a cost to oneself. 

Altruistic data sharing, however, is not simply a digital articulation of traditional philanthropy. Unlike traditional donations, data is non-rivalrous: sharing it does not deplete it. Theoretically, a company could offer its entire database for socially beneficial use while capitalizing on the same database for its own commercial interests. Indeed, in an AI-driven economy, many firms compete not by having exclusive access to data, but by what they do with that data. EU policymakers should therefore not assume that for-profit organizations cannot be altruistic in the data economy.

Instead, they should seek to encourage the reuse of private sector data for social good by first providing regulatory clarity for firms that want to share their data. Existing EU laws on data protection and privacy—namely the GDPR and the ePrivacy Directive—include exemptions that support data sharing for the public good, but neither the DGA nor any other EU legislation provides regulatory clarity for private firms that want to share data for social good.

The Commission should consider taking a leaf out of Finland’s book. In 2019, Finland passed a law that, in part, spells out conditions for the secondary use of private-sector health and social data, including the types of personal data that may be processed; the entities to which, and purposes for which, the data may be disclosed; and the processing procedures entities must follow, including measures to ensure lawful and fair processing. The law even established the Finnish Health and Social Data Permit Authority (“Findata”), a single authority to oversee the entire data-sharing process. While this approach works for Finland, the EU needs to create a solution that is available to all member states.

Many for-profit companies have the data the EU needs to help solve some of the most pressing social challenges it faces. Better yet, many of these companies want to share. The EU should provide a pathway to help them do that.

Image credits: Pixabay
