The United States Law Week

INSIGHT: Covid-19 Pandemic Highlights the Need for Privacy Legislation

May 29, 2020, 8:00 AM

In the search for solutions to the Covid-19 crisis, organizations across the health-care industry are looking to big data—vast, varied stores of data—to support research that responds to a myriad of challenges.

Big data is an essential tool for researchers looking not only to track and contain the spread of the virus but to manage and monitor the delivery of health care and community resources, identify individuals who may be particularly vulnerable to the virus in the midst of the pandemic, understand treatments, and develop a vaccine.

Aggregated, de-identified big data—when coupled with powerful analytics, artificial intelligence, and machine learning—can help researchers identify ways to address not just treatment and prevention. It also can assist in understanding how to manage Covid-19’s downstream societal impact on families and communities, particularly isolated elderly and people with disabilities, people in poor and underserved communities, and children who lack Internet access necessary to attend school.

The Covid-19 pandemic has highlighted the power of big data to support essential research. Critical to the ability to reap its benefits is federal privacy legislation that reflects an understanding of how big data works and where its processing does and does not raise risks to individuals, and that holds companies accountable for the decisions they make about its use.

With an effective, workable law in place, big data can equip researchers to understand and address not only the current health crisis, but also societal issues far into the future.

Promoting Responsible Use of Data

Unlocking the full benefit of big data will depend on Congress’s commitment to passing a privacy law that protects individuals and promotes the responsible use of data. Before the Covid-19 crisis, policymakers’ efforts toward legislation already were underway, though fractured.

The pandemic underscores the need for federal legislation that specifically takes into account the role of big data in research for legitimate business, scientific, educational, social services, and government use. Without an effective law, individuals will lack confidence that their data is being used for beneficial purposes and that their privacy is respected.

Absent clear legal requirements, researchers and organizations may be reluctant to use big data to reveal new insights for beneficial purposes, in an attempt to avoid legal and reputational risks. Such risk reticence can slow researchers’ work and lead them to forgo opportunities to use data to address critical problems.

Any new law should address the specific risks big data raises. But it should also distinguish big data processing from use of data that is linked to individuals and that raises greater risks to privacy. It should recognize that analyzing big data for knowledge discovery—to understand what insights data reveals—poses far fewer risks to individuals than when it is used to make decisions about them.

Technological and Contractual Measures are Key

These distinctions are valid, however, only when companies adopt internal policies and effective technological measures to de-identify big data and maintain it in aggregated form, and only when clear, enforceable contractual obligations are in place that articulate how the data may and may not be used, in what instances it may be shared, and for how long it may be retained. Specific, legally binding contractual and technological provisions also are necessary to prevent attempts to re-identify the data.

The law must obligate users of big data to take such steps, and allow greater flexibility to use big data to unlock correlations and patterns only when these measures are taken. It must also impose more stringent requirements when research results are applied to people.

These controls are critical not only to protecting privacy, but also to maintaining trust. Consumers will only be willing to make their data available for these important purposes if protections are in place.

Keeping Companies Accountable

A federal privacy law must hold companies accountable for the decisions they make about the processing of big data for research. Such decisions must be based on credible assessment and mitigation of the risks raised by big data processing for research.

It will be important to include accountability-based guidance about how such risk analysis should be carried out, both to assure individuals that proper protections are in place and to give organizations needed clarity and certainty about requirements. This guidance should specifically address big data research that involves the analysis of sensitive data—such as health, genetic, or location data.

This column does not necessarily reflect the opinion of The Bureau of National Affairs, Inc. or its owners.

Author Information

Paula Bruening is a founder and principal at Casentino Strategies LLC and former global director of privacy policy at Intel. She is also currently an Innovators Network Foundation Privacy Fellow.
