Data Privacy and Contact Tracing

Written by Jeffery Smith, Director of the Center for Business Ethics & Eva Sedgwick, Associate Professor of Business Law
May 14, 2020

When we think of COVID-19 we are immediately drawn to certain places—Wuhan, Lombardy, Tehran and New York City. But lesser-known places are beginning to receive attention.

Take North Dakota. Douglas Burgum, the state’s governor, has spearheaded an effort with the state’s public health agency to use a mobile app that efficiently identifies, traces and contacts individuals who have been exposed to the coronavirus. This app not only assists particular people who need to know when to get tested and self-quarantine, but it also identifies areas of concern that may be “hotspots” for transmission.

Google and Apple have developed similar technology within their respective mobile operating systems. Those systems aim to gather information about individuals who have tested positive for the virus, identify the other people with whom infected individuals have come into contact, and provide notification and medical guidance to anyone who may have been exposed. Both companies have noted, however, that unlike North Dakota’s app, they will not track users’ location.

At the heart of this technology is the acquisition of geospatial data about individuals as they move within the community. The technology not only locates individuals in time and space but uses Bluetooth signals to send a certificate to other proximate mobile devices. Those technological “handshakes” between devices can be logged, stored and used when any one mobile device user reports that they have tested positive. Those with prior contact with this user are immediately notified on their mobile phone.
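To make these “handshakes” concrete, here is a minimal sketch in Python of how proximity logging and exposure notification might work. The names and parameters here (ContactLog, a 14-day exposure window, and so on) are our own illustrative assumptions, not the actual North Dakota or Apple/Google implementation, which adds further cryptographic safeguards (such as rotating keys) that we omit.

```python
import secrets
import time

EXPOSURE_WINDOW_SECONDS = 14 * 24 * 3600  # hypothetical: only consider contacts from the past 14 days


def new_token() -> str:
    """Generate a random, anonymous token to broadcast over Bluetooth.
    Real systems rotate these tokens frequently so they cannot be linked to a person."""
    return secrets.token_hex(16)


class ContactLog:
    """Stores tokens 'heard' from nearby devices, with timestamps."""

    def __init__(self):
        self.heard = []  # list of (timestamp, token) pairs

    def record(self, token: str) -> None:
        self.heard.append((time.time(), token))

    def check_exposure(self, reported_tokens: set) -> bool:
        """True if any recently logged token was later reported by a positive case."""
        cutoff = time.time() - EXPOSURE_WINDOW_SECONDS
        return any(ts >= cutoff and token in reported_tokens
                   for ts, token in self.heard)


# Device A broadcasts a token; device B is nearby and logs the "handshake."
token_a = new_token()
log_b = ContactLog()
log_b.record(token_a)

# Later, user A tests positive and uploads their recent tokens to a health authority.
reported = {token_a}

# Device B checks its local log against the published tokens and alerts its user.
if log_b.check_exposure(reported):
    print("Possible exposure: consult testing and self-quarantine guidance.")
```

Note that in this design the matching happens on the user’s own device; only the tokens of those who report a positive test ever leave a phone.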

Countries that have effectively managed the COVID-19 pandemic have found this technology indispensable. Singapore, India, Norway and Taiwan provide some interesting examples. China, of course, has deployed mobile technology even more aggressively.

Civil libertarians have expressed a number of concerns with mobile contact tracing. They center on two related issues. First, will individuals have full knowledge about how their location data is being gathered, stored and used, and will they be afforded an opportunity to control or withhold that data? Second, since the pressure to roll out the tracing technology is immense, does the rush toward implementation risk new forms of surveillance by companies or government agencies? Individual control of data and surveillance are the two axes along which tracing technology raises a number of privacy objections.

Individual Control of Data

Control over our geospatial data is not just important to our autonomy; that data reveals part of our identity. This is true even if—as promised by Apple and Google—it is anonymized and not associated with our legal names, residential addresses or other online accounts.

A recent report by the American Civil Liberties Union (ACLU) discusses the degree to which individual liberty can be protected using mobile-based tracing technology. It argues that the choice to download and use the technology is only one consideration. There are a number of other questions to ask. Will the platform or app have clearly identified terms and conditions in plain language? Can users decide when to “turn on” the technology, and when to “turn it off”? Is there an option for those who report a positive test to upload their location data? Can that data be selectively redacted? Is there an option to select which agencies or health care companies receive location data and testing reports? The list goes on.

App developers, technology firms and government agencies have a large zone of discretion as to how these questions will be addressed. But it needs to be stressed that the right to control one’s geospatial data is not an all-or-nothing value. We readily admit that there may be justifiable limitations to it and that there are obligations to participate in efforts to contain the spread of the virus.

One example of how control over one’s personal data may be justifiably limited by this social obligation is how and by whom one’s data is used. It is easy to imagine that timely, anonymized geospatial data will need to be shared with multiple actors in a coordinated fashion. Allowing individuals to decide which hospital or health authority can “see” their data may run counter to the most important goals of the technology. While an “opt in” design seems like a basic starting point, there are other design elements that may represent a suitable tradeoff between individual control and public health. Indeed, this is why states like North Dakota and countries like Germany have made choices that allow the use of mobile technology that involves gathering users’ location data.

Protecting Privacy

The technological needle to thread is to reduce individual control of personal data in limited ways while still providing proper checks on how companies and government agencies gather and use data.

First, if we are concerned about reducing individual control, a number of design elements can limit the scope of the technology’s operation. Perhaps the platform could automatically cease operation after a certain period of time, with data logs permanently deleted. Assurances can be given and audited that restrict the technology’s use to a particular virus. Technology firms and agencies using the data can retroactively make disclosures to provide verification that the data was anonymized and used only by parties identified in the platform’s disclosures. Apple and Google appear to be recognizing some of these possible limits in scope.
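Such scope limits can be made mechanical rather than left to policy alone. The short Python sketch below is only an illustration of how that might look—the retention window, sunset date and function names are our assumptions, not any vendor’s actual design: contact records older than a fixed window are permanently purged, and no new data is collected after a declared end date.

```python
import time
from datetime import datetime, timezone

RETENTION_SECONDS = 14 * 24 * 3600  # hypothetical: keep contact records for only 14 days
SUNSET_DATE = datetime(2021, 6, 30, tzinfo=timezone.utc)  # hypothetical end-of-program date


def purge_expired(records):
    """Permanently drop any (timestamp, token) record older than the retention window."""
    cutoff = time.time() - RETENTION_SECONDS
    return [(ts, token) for ts, token in records if ts >= cutoff]


def logging_allowed(now=None):
    """Refuse to collect any new data once the declared sunset date has passed."""
    return (now or datetime.now(timezone.utc)) < SUNSET_DATE
```

Whether such limits are honored is, of course, exactly what independent audits and retroactive disclosures are meant to verify.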

Second, if what justifies reducing individual control of data is public health, then it stands to reason that medical and public health officials should be closely involved in the creation and implementation of the technology. This way the relevant stakeholders are involved and the tendency for commercial interests to supersede the public health objectives of the technology can be mitigated.

This call recognizes a guiding principle in the ethical design of any technology. Responsible use of technology is more likely to be assured when it remains a tool supporting human endeavors. Granting scientists a seat at the table will help prevent the technology from becoming an end in itself—perhaps applied in other nefarious ways—as opposed to an instrument serving the common good. It also increases the chances that the technology operates only in ways necessary for effectively tracing the virus and containing its spread.

Both of these steps, while not exhaustive of what can be done by technology firms, underscore that our right to privacy is not simply an unqualified right to be left alone. Privacy is context-sensitive. Whether it is appropriate that someone (or some organization) comes to possess our personal information depends on our relationship with that party. Our expectations about what our family physicians may legitimately know about us are unique to the patient-doctor relationship. The zone and expectation of privacy around personal information in that context would not carry over into our relationships with friends, businesses or the government.

A responsibly designed tracing platform therefore needs to understand what geospatial data is relevant to acquire and use, knowing that individuals, public health agencies and technology firms stand in particular relationships with one another. By maintaining a clear and transparent “opt in” design, limiting the scope of the technology’s operation to anonymized geospatial data over a specific time and disease, verifying compliance with these limitations, and by including the perspective of disease specialists in the creation of the platform, we can begin to see how certain pieces of personal information can be legitimately possessed by certain parties when tracing technology is used.

Critics of the technology will rightly point out that these steps rely on the trustworthiness of the firms designing it. This is perhaps where the past business practices of companies like Google and Apple will be an important factor in the public’s willingness to participate in mobile tracing.

Eva Sedgwick is an Associate Professor of Law at Seattle University.
