We need privacy and data laws to tackle this global pandemic
Governments are increasingly using digital technologies and big data analytics to address the Covid-19 pandemic. These technologies cannot replace other comprehensive measures, and their adoption must account for the risk of further entrenching surveillance and eroding the rule of law.
Life as we know it came to a halt a few weeks ago. Digital technologies, however, have played a crucial role in keeping many of us afloat, with social and work or study interactions online, and gig-work or delivery services to keep people working and businesses going. Many of us are increasingly using messaging apps to check on distant loved ones.
Besides keeping us connected, digital technologies are also being used to tackle the pandemic crisis. Apps and big data analytics are tracking citizen movements to identify and prevent transmission between known cases and the people they interact with. In addition, this technology can help to suppress silent transmission by reducing contact between individuals: China developed and deployed apps where people could log their health status and check if they had been in close contact with an infected person, allowing the government to follow the spread of the disease and to issue authorizations for individuals to circulate in public. Israel’s prime minister recently used his emergency powers to tap into cellphone data—previously used as a counter-terrorism measure—to follow the virus. South Korea developed an app to supervise people under quarantine. Taiwan, early on, integrated its national health insurance database with its immigration and customs database to identify travelers that could be bringing the virus to the country.
The US is also adopting technological solutions to keep track of the pandemic. The government has started talks with firms like Google, Facebook, Palantir and Clearview AI “...about how they can use location data gleaned from Americans’ phones to combat the novel coronavirus.” Volunteers have also built “Coronapps” that would allow users to see if they have crossed paths with an infected person, encouraging self-isolation and self-monitoring. According to the Wall Street Journal, government officials are already using location data from cellphones to understand the movements of Americans and how they may be affecting the spread of the disease.
But scholars and activists working on privacy have raised concerns, especially when the technology relies on re-identifiable data: these apps may reveal personal details of patients’ lives, leading to speculation and, at times, stigmatization over their activities and the locations they visited. If not widely used, these apps may create a false sense of safety, and if widely adopted they may create a collective panic with its own harmful effects. There are also concerns about this data being fed back to other government agencies and being used, later or in the meantime, for purposes unrelated to the pandemic. Yet the current consensus seems to be that aggregated mobility data poses fewer privacy risks.
In times of crisis, civil and individual rights and interests are often limited to accommodate measures intended to advance public interest goals. Curfews, mandatory business closures, and travel restrictions fall within these lines; limiting individual privacy rights could as well. These limitations, however, should be evaluated carefully, taking into special account their potential effectiveness, their cost, and whether similar results could be achieved with measures less invasive of privacy rights. Broad and vague surveillance authority would seriously compromise privacy rights and, in many constitutional democracies, the rule of law. In Israel, for example, the far more important action to flatten the curve was to limit leaving home to essential purposes, for which increased surveillance was not necessary.
Lastly, a key point to keep in mind is that technological solutions that are rolled out are hard to roll back. Much as happened after 9/11 and the war on terror, once we have managed to deal with this pandemic, we will fear the next one, so there is a good chance we will leave the surveillance apparatus in place to prevent it, and then we will get used to it. And, of course, many companies will by then have made it part of their business model.
It may be that in many places, large-scale stay-at-home orders will be the most suitable way to contain the pathogen, as various European countries and many US cities and states are mandating. This is not to say governments should not use personal information to tackle the emergency, nor collaborate with the private sector at all. We are most likely going to need technology to get us out of this one. Indeed, the European Commission ordered telecom companies to share aggregated mobility data to follow the spread of the disease and monitor compliance with stay-at-home orders. Individual European countries have also taken data-related measures to address the crisis: Germany, for example, inserted wording into its GDPR enabling legislation that specifically allows for the processing of personal data in the event of an epidemic, and Italy passed emergency legislation requiring anyone who has recently stayed in an at-risk area to notify health authorities either directly or through their doctor. Similarly, as countries evaluate ways to allow healthy and immune people to go back to work, mobility-permit applications that assess and certify the risk posed by a particular individual, along with widespread testing and contact-tracing measures, may be useful and even necessary to ease the current restrictions on people’s right to move and work.
My point is, rather, that as governments adopt data-enabled solutions (for example, to target the likely subsequent waves of the disease or to issue mobility permits), these solutions need to be implemented carefully. The measures governments take must be necessary and proportionate to the crisis. To safeguard our liberties and the rule of law, they must also be implemented hand in hand with strict rules about how the information these solutions collect can be combined with other data, and limits on how, by whom, and for how long it can be used. Policies that further limit personal privacy should be restricted exclusively to addressing this pandemic, and to particular objectives that are very hard to achieve by any other means. Consent should not be the main privacy-protecting mechanism, and the limits on how the data can be used should not be waivable. Along similar lines, the European Data Protection Board has published a statement on the use of personal data to address the pandemic, and the Electronic Frontier Foundation has issued guidelines on how data collection could remain permissible while protecting both public health and privacy. Some governments are already pushing emergency decrees further than seems necessary and proportionate, as Hungary’s Orbán just did; it will be an important task for human rights advocates, civil society, and courts to remain vigilant.
There is a good chance this will be a long, bumpy ride. Yet now is the time to think about what we might want to change for when we go back, slowly, to our daily lives, and of course, what the role of technology in that future should be. It doesn’t need to be an enhanced surveillance future. Maybe, as we see pollution levels drop, we’ll learn that we can travel less and use video conferences more. Maybe some cultural materials or textbooks could be free and open for all, always. And maybe, hopefully, we’ll finally come up with better personal data protection laws.
A previous version of this article was published in the Berkman Klein Center for Internet and Society’s Medium Collection on March 18, 2020.
Beatriz Botero Arcila is a fellow at the Berkman Klein Center for Internet and Society at Harvard University and doctoral candidate at Harvard Law School.