Why the anti-appers?
Michael Kende argues that the lack of digital trust in contact tracing apps that could help control the pandemic, save lives, and normalise our societies is a major wake-up call.
The imminent arrival of COVID-19 vaccines has highlighted the long-running concerns of those opposed to vaccines, sometimes known as ‘anti-vaxxers’. But COVID-19 already has a tool to fight its spread – the contact tracing application. It is a simple, painless way to help curb transmission: it notifies users when they have been in proximity to someone who was later diagnosed. However, usage is low, and a primary reason is concern over privacy and data protection.
Recently the Graduate Institute, Geneva, and the EPFL with its Center for Digital Trust co-hosted the Data 2025 conference, which shed light on the broader issues facing the contact tracing apps. The apps gather data about us, and our confidence about what happens with our data has been shaken by a growing drumbeat of bad news – in short, there is a lack of digital trust. As Martin Vetterli, President of the EPFL, noted in his conference keynote, “digital trust is when we feel we can trust the digital world as much as we do the real world”.
In the real world, contact tracing is not new in fighting a pandemic. Traditionally, it has been done manually, by painstakingly contacting everyone who the sick person remembers being near while contagious. However, given the exponential spread of COVID-19, manual tracing was soon overwhelmed. Smartphones offer an excellent means to automate contact tracing, because they know where they have been, can identify other users in close proximity, and, unlike us, have a perfect memory. However, those features quickly raised privacy concerns over who would receive this data, and what they might do with it.
The team that developed the SwissCOVID App, including EPFL professors and others, built a privacy-protecting app using the Exposure Notification feature developed by Apple and Google for phones running their operating systems. The app identifies us with a random ID, and uses Bluetooth to keep track of all the random IDs of users who were close to us for at least 15 minutes. When someone enters a positive diagnosis into their app, everyone who has been in proximity is automatically notified. The app does not know our name, keep track of our location, or provide any personal information to a central authority.
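The decentralised matching described above can be sketched in a few lines. This is a deliberately simplified illustration, not the actual SwissCOVID or Exposure Notification implementation: all class names, parameters, and the in-memory data structures are invented for this example, and real deployments rotate IDs frequently and estimate proximity from Bluetooth signal strength.

```python
import secrets
from dataclasses import dataclass, field

ID_BYTES = 16          # length of the rotating random identifier (illustrative)
EXPOSURE_MINUTES = 15  # minimum proximity duration, as in the article

@dataclass
class Phone:
    """Simplified sketch of a decentralised exposure-notification phone."""
    my_ids: list = field(default_factory=list)   # random IDs we have broadcast
    seen: dict = field(default_factory=dict)     # observed ID -> minutes nearby

    def new_broadcast_id(self) -> bytes:
        """Generate a fresh random ID; no name or location is involved."""
        rid = secrets.token_bytes(ID_BYTES)
        self.my_ids.append(rid)
        return rid

    def observe(self, other_id: bytes, minutes: int) -> None:
        """Record time spent near another phone's broadcast ID."""
        self.seen[other_id] = self.seen.get(other_id, 0) + minutes

    def check_exposure(self, published_ids: set) -> bool:
        """Match locally stored contacts against IDs published by diagnosed
        users. Matching happens on the phone, not on a central server."""
        return any(rid in published_ids and mins >= EXPOSURE_MINUTES
                   for rid, mins in self.seen.items())

# A diagnosed user uploads only their random IDs to a public list.
alice, bob = Phone(), Phone()
bob.observe(alice.new_broadcast_id(), 20)  # Bob was near Alice for 20 minutes
published = set(alice.my_ids)              # Alice reports her diagnosis
print(bob.check_exposure(published))       # True: Bob is notified
```

The key privacy property is visible in the sketch: the only thing ever shared is a list of meaningless random bytes, and the decision to notify is computed locally on each phone.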
This level of privacy engineering does not appear to be sufficient, however: only 22% of the population in Switzerland is using the app, in large part because of a lack of digital trust about what happens to our data. Data is unique, and while many analogies have been proposed – oil, nuclear waste – none is perfect. We cannot see it, we cannot know where it is, we do not know when it is used and for what, and we can never know if it is truly deleted. The solution is data governance designed to create digital trust.
Data governance must be developed with a multi-stakeholder approach – no one party can develop or ensure it. The underlying technology is clearly critical, as it determines at a base level what data is known. However, while the SwissCOVID App and others were engineered to ensure privacy, we have seen that although such engineering is necessary to protect privacy, it is not sufficient to generate digital trust.
What about governments? Governments can set laws that protect our data. The recent EU General Data Protection Regulation (GDPR) is widely considered to be state of the art – so much so that other countries are implementing new laws modelled on it. But so far, government action is also not sufficient to develop digital trust.
Companies also play a leading role. Companies provide access to the internet, devices, and the services that collect our data. Trust is quicker to break than it is to repair, and companies have certainly contributed to the lack of digital trust today. Adhering to regulations is a necessary baseline. However, this has not been sufficient to create trust, and some companies are going further. Microsoft, for instance, is extending the privacy rights under GDPR to its customers globally.
Last but not least, civil society organisations such as the Internet Society can help to represent the interests of users, and promote ethical data handling practices and privacy-protecting technologies. They can help to advocate for government regulations, and company action. They can also play a critical role in educating users and developing digital trust where trust is deserved.
Every day, most of us share our location information with dozens of companies, unwittingly and often with no direct return, yet we balk at doing so for COVID apps. The lack of digital trust in a privacy-protecting tool that could help control the pandemic, save lives, and help to normalise our societies is a major wake-up call. A holistic approach to data governance that builds digital trust is critical for the future, in 2025 and beyond. It will shape the uptake of new services in healthcare, insurance, finance, and services we may only be able to conceive of once digital trust is improved.