For years, policymakers and privacy watchdogs have feared the power Big Tech wields through its vast collection of user data. Yet these same tech giants were given the go-ahead to spearhead the contact tracing apps that many governments now use to identify and notify those who have come into contact with a carrier.
At a time when this market sorely needed an effective product, Apple and Google stepped up. Together they announced a joint contact tracing API. As the companies behind iOS and Android, they control the platforms through which the movement data of millions of people flows. Here’s what they had to say:
“Through close cooperation and collaboration with developers, governments and public health providers, we hope to harness the power of technology to help countries around the world slow the spread of COVID-19 and accelerate the return of everyday life.”
Apple and Google’s joint statement
Despite the avalanche of services, however, we know very little about contact tracing apps or how they could affect society. What data will they collect, and who is it shared with? How will that information be used in the future? Are there policies in place to prevent abuse?
While it is of the utmost importance to deliver effective contact tracing apps, an especially necessary tool for curbing the spread of an airborne virus like COVID-19, there is also a very real need to address these privacy concerns.
Under the hood
Before we get into dissecting the potential privacy concerns, I want to quickly break down the different underlying technologies these apps use.
- Location: Using GPS or triangulation from nearby cell towers, these apps identify whether you’ve come into contact with a carrier by tracking your phone’s movement and comparing it with other phones that have spent time in the same locations.
- Bluetooth: Phones swap encrypted tokens with any other nearby phones over Bluetooth, using proximity rather than location to identify whether you’ve come into contact with someone. This is generally considered better for privacy, as it is easier to anonymize user data. If using GPS is like storing absolute addresses, Bluetooth essentially uses relative addresses, which adds a strong layer of anonymity.
- DP-3T: Works much like Bluetooth tracking, with one key difference: each phone’s contact logs are stored only locally, hence the name decentralized privacy-preserving proximity tracing. This design takes the greatest care to address privacy concerns, though it trades away some of the app’s effectiveness.
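To make the decentralized idea concrete, here is a minimal sketch of DP-3T-style token exchange with on-device matching. All names here (`Phone`, `ephemeral_ids`) are hypothetical, and the key derivation is a simplified stand-in: the real DP-3T spec derives its 16-byte ephemeral IDs with HMAC and AES in counter mode, not a bare hash chain.

```python
import hashlib
import os

def ephemeral_ids(daily_key, count=4):
    """Derive short-lived broadcast IDs from a daily secret key.
    (Simplified: real DP-3T uses HMAC + AES-CTR, not a hash chain.)"""
    ids, current = [], daily_key
    for _ in range(count):
        current = hashlib.sha256(current).digest()
        ids.append(current[:16])  # 16-byte tokens, as in DP-3T
    return ids

class Phone:
    def __init__(self):
        self.daily_key = os.urandom(32)  # leaves the device only on infection
        self.seen = set()                # contact log, kept strictly local

    def broadcast(self):
        # Rotating tokens sent over Bluetooth to nearby phones.
        return ephemeral_ids(self.daily_key)

    def observe(self, nearby_tokens):
        # Log every token heard; nothing is uploaded anywhere.
        self.seen.update(nearby_tokens)

    def check_exposure(self, published_infected_keys):
        # Matching happens on the device: re-derive each infected user's
        # tokens and intersect them with the local contact log.
        return any(self.seen & set(ephemeral_ids(k))
                   for k in published_infected_keys)

# Two phones near each other swap tokens over Bluetooth:
alice, bob = Phone(), Phone()
bob.observe(alice.broadcast())
# Alice tests positive and publishes only her daily key; Bob's phone
# detects the match locally.
assert bob.check_exposure([alice.daily_key])
```

Note that in this model a backend only ever distributes the daily keys of users who test positive; healthy users’ contact logs never leave their phones.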
What Google and Apple’s joint API offers is only a set of underlying protocols inside Android and iOS. In other words, Apple and Google have done the groundwork, making sure that health apps can talk to each other across Android and iOS and get access to the features they need. It’s now up to countries to develop the apps that plug into these foundations and provide the actual front-end interface for users. A crucial part of this underlying framework is access to Bluetooth signals. As mentioned above, matching Bluetooth tokens to trace the contacts of infected users is generally seen as a pro-privacy approach.
If you want a more detailed breakdown of the different contact tracing apps, check out this tracker.
If you look at how these technologies function, you can see that the more an app does to address privacy concerns, the more trade-offs it makes in its own effectiveness. Understanding which underlying technology is being used helps us better understand what data is being collected and who it is being shared with.
Apple and Google have also said that access will be granted only to public health authorities, and that their apps must meet specific criteria around privacy, security, and data control to qualify. This has meant that some countries and states have decided to go their own way, usually because they are unhappy with the constraints of non-centralized storage.
Even though some may call this a misuse of influence, since it limits which players can enter the market, I believe the restriction is a positive move. Creating such credible technology while ensuring that people can only use it in a privacy-preserving manner is a responsible way to deploy tech that could easily be misused. I think this move deserves more praise than it’s getting.
Why it’s significant
In the past, most significant inventions, like the computer and the internet, were the result of public sector involvement in funding and personnel. Governments and publicly funded bodies played a huge role in the development of such valuable ideas, and many of these innovations were catalyzed by national emergencies, much as COVID-19 is doing now. Lately, however, there has been a heavy reliance on the private sector to deliver these innovations. This shift in dependency is not only somewhat novel but also marks an acknowledgement of the private sector’s importance.
Beyond that acknowledgment, many of these private tech companies have an ulterior motive: the accumulation of data. This brings us to our next concern: how will that information be used in the future?
During past national emergencies, there has been substantial evidence of governmental overreach, often disregarding the rights citizens are granted. Such times are used as opportunities to fast-track legislation that enables that overreach. For example, soon after 9/11 the Patriot Act was passed: a heavily criticized bill that allowed the tapping of domestic and international phones, increased surveillance of citizens, and other similar measures that privacy watchdogs called an “erosion” of citizens’ rights.
So the idea that the powers that be may misuse their influence during a national emergency to profit from it later isn’t surprising; it’s just that the potential perpetrator is different. Privately made contact tracing apps may spare us governmental overreach, but who oversees the private companies? Either party can do damage with movement data, and in the past governments have misused this power not just in general but in this specific context of movement data too. The coming challenge will be to stop private companies from exploiting their users the way citizens of oppressive countries have been exploited. Knowing how the data will be handled in the future is of utmost importance if we want to prevent companies from misusing the data they are currently collecting.
Here I think there’s a larger question to be asked: whose hands are you more comfortable placing your data in, the government’s or private companies’? Privacy experts would argue that there is a workable solution in which you own your own data, and it is easily the best option, but given the current time constraints I don’t believe it is a practical alternative right now. Deciding who will own your data involves weighing different political and profit motives. This brings us back to whom we trust more to handle responses to national emergencies: Big Brother or Big Tech?
To understand what guardrails exist to prevent misuse of data, I think it’s important to understand how different countries are using contact tracing apps. It’s also worth noting that Apple and Google are issuing blanket terms, applicable globally, while different governments, each with their own political intentions, are trying to bargain for different terms at a national level.
A sticking point between the tech giants and national representatives is who will process the exposure notifications. Apple and Google want to process notifications on users’ phones, without storing them on a central server, to preserve the maximum degree of privacy possible. Many national representatives, however, argue that storing this data on centralized servers allows for better analysis, helping them identify additional exposures faster and thus contain the spread of the virus more rapidly. It’s quite an interesting dilemma: corporations trying to do right by their users versus countries trying to do right by their citizens.
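The contrast with the decentralized model can be sketched in a few lines. In a centralized design (all names below are illustrative, not any country’s actual implementation), phones upload the tokens they heard, so the server itself performs the matching and, as a side effect, can see the contact graph:

```python
class CentralServer:
    """Illustrative centralized model: the server holds every user's
    contact log and performs the exposure matching itself."""
    def __init__(self):
        self.contact_logs = {}  # user_id -> set of tokens that user heard

    def upload_log(self, user_id, heard_tokens):
        # In the decentralized model, this upload never happens.
        self.contact_logs[user_id] = set(heard_tokens)

    def exposures_for(self, infected_tokens):
        # Server-side matching: find everyone whose log overlaps the
        # infected user's broadcast tokens.
        infected = set(infected_tokens)
        return sorted(uid for uid, log in self.contact_logs.items()
                      if log & infected)

server = CentralServer()
server.upload_log("bob", {"token-a", "token-x"})
server.upload_log("carol", {"token-y"})
# Alice (who broadcast "token-a") tests positive:
assert server.exposures_for({"token-a"}) == ["bob"]
```

Because the server sees whose logs overlap with whose, health authorities can analyse the contact network, which is the effectiveness argument; that same visibility is exactly what the decentralized design is built to prevent.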
When countries decide to go against the grain, developing individual apps, there’s a mixed bag of results.
The UK initially committed to creating an independent app, but its exposure notification app hasn’t seen great success. This essentially boils down to one key idea: not even a UK-government-approved body has the influence to override permissions at the OS level.
Generally, constant broadcasting of Bluetooth signals is frowned upon, and even restricted by Android and iOS, because it can be used for targeted advertising. But since Google and Apple build the operating systems, they can ensure that only their own API has permission to broadcast constantly. By design this gives them an inherent technical advantage, as the UK’s app won’t be granted such permission.
As a result of mounting criticism that the app isn’t effective, and given the EU’s reluctance to work with Google and Apple on this project, the UK has had to consider shifting to a centralized-storage-based system, which in turn will be criticized for being more vulnerable to privacy loss.
Australia’s COVIDSafe app suffers from similar performance issues, as the OS restricts how frequently an app can relay Bluetooth signals.
South Korea was praised for its handling of the virus outbreak, and much of that can be attributed to how contact tracing was handled.
The government used a combination of tools, such as cellphone location data, CCTV footage, and credit card records, to broadly monitor citizens’ activity. When someone tested positive, almost all available data was broadcast online, like a flood warning. These individuals were put on blast: their last name, sex, age, district of residence, and credit card history, along with a minute-by-minute record of their comings and goings from various local businesses, were all revealed. The general ideology adopted by countries like South Korea, Singapore, and China is that the wider social good trumps privacy, and that this justifies overreach.
Looking at just these two countries, you can see a microcosm of the world. There are broadly two schools of thought, which come down to one question: do you sacrifice privacy for the effectiveness of the app? Whether or not you agree with South Korea’s approach, evidence does suggest that it, and countries following similar strategies, contain the virus more effectively when such overreach is allowed. Ultimately, care must be taken that such governmental overreach lasts only for the time being and does not become an opportunity to entrench long-term authoritarian control.
Google and Apple have designed a well-intentioned framework. They have done a good job of acknowledging the intrusive power of the tool and its potential for malicious abuse. In doing so, however, they have also promoted an app design that evidence suggests won’t be as effective as governments would prefer.
Big Tech has been put in an unfortunate position. Countries that often criticize these companies for ethical lapses are now asking them to lower their privacy standards. They’re in a lose-lose position. Beyond a credibility-building exercise, and obviously helping limit the spread of the virus, these companies don’t have much to gain from helping out with contact tracing.
While drawing the line between privacy and effectiveness, essentially the extent of overreach, will be an iterative process that requires cooperation from both parties, I am more concerned with two larger questions this situation raises. One: how far are we personally comfortable sacrificing our rights and liberties in times of crisis for the greater social good? And two: for how long will democratic countries allow technology companies to dictate the terms?