New apps are being developed in an attempt to get the Covid-19 pandemic under control and restore a halfway normal life. Stitched together in haste, these apps could become a disaster. The good news: trustworthy solutions are in sight and might serve as a blueprint for the future.

When you stand at a traffic light in a car with automatic transmission and take your foot off the brake, the car starts to roll forward slowly. States seem to have the same built-in drive when it comes to “safety”. If you don’t keep your foot on the brake all the time, the state wants “more safety”. And it doesn’t even have a reverse gear.

This is also the case with the digital protection measures against the Covid-19 pandemic, where smartphone apps are meant to help in various ways. One scenario for a Corona app is to use Bluetooth to document all close contacts with familiar or unknown people. (Querying mobile network cells would not be accurate enough for this.) If a person falls ill, everyone in their chain of infection can be notified and further spread can be prevented. Other solutions focus on data donations, self-observation, test results, test certificates as admission tickets, or anonymous movement profiles to assess general mobility.
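The contact-documentation scenario can be sketched in a few lines. The following is a minimal, purely illustrative sketch (the names and constants are my assumptions, not the design of any real app): a phone broadcasts a short-lived random identifier over Bluetooth, records the identifiers it hears from nearby phones, and deletes everything after the incubation period. No names, no locations.

```python
import os
import time

EPHEMERAL_ID_LIFETIME = 15 * 60    # rotate the broadcast ID every 15 minutes
RETENTION_PERIOD = 14 * 24 * 3600  # keep contacts only for the incubation period


class ContactLogger:
    """Illustrative sketch: broadcast rotating random IDs and log the IDs
    of nearby phones. No names, no locations; nothing leaves the device."""

    def __init__(self, now=None):
        self.rotated_at = time.time() if now is None else now
        self.current_id = os.urandom(16)
        self.seen = []  # list of (ephemeral_id, timestamp), stored locally

    def broadcast_id(self, now=None):
        """Return the ID to advertise over Bluetooth, rotating it regularly
        so that no one can track a single phone across the day."""
        now = time.time() if now is None else now
        if now - self.rotated_at >= EPHEMERAL_ID_LIFETIME:
            self.current_id = os.urandom(16)
            self.rotated_at = now
        return self.current_id

    def record_contact(self, other_id, now=None):
        """Store an ID received from a nearby phone (locally only)."""
        now = time.time() if now is None else now
        self.seen.append((other_id, now))

    def purge_old(self, now=None):
        """Delete contact records older than the incubation period."""
        now = time.time() if now is None else now
        self.seen = [(i, t) for (i, t) in self.seen
                     if now - t < RETENTION_PERIOD]
```

Because the identifiers are random and rotate, a bystander cannot link them to a person; because they stay on the device, no central party learns who met whom.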


Civil rights by design

Open questions arise quickly:

  • How does anonymisation work, and how does notification work in case of need?
  • Is there a central server that collects all the information?
  • In what form does such a message reach me, given that it will, after all, demand drastic restrictions for two or three weeks?
  • Who else learns about this message?
  • How can false positives and trolls be filtered out without leaving everything to the authorities?
  • Are people disadvantaged if they don’t want to install the app or don’t own a smartphone?
  • Can one trust in not being monitored, even as a potential “threat”?
  • Which data is collected and transmitted in the background?
  • Do people behave more carelessly once they have the app, and is this desirable?

… and so on.

Including such considerations in the concept is not a civil-rights pirouette or a formal exercise; it is crucial for the acceptance and success of such a measure. Many people are very disciplined, yet equally disturbed by the current restriction of their freedoms. They trust that a nationwide contact ban will, of course, not be permanent. And they don’t want to see it enforced with a sledgehammer. Turning the mobile phone into an electronic ankle monitor would clearly be disproportionate, because Covid-19 is not Ebola and the people involved are not criminals.

Scepticism is justified

Times of crisis enjoy a perfidious popularity among some politicians, because in such times the state and the government receive an advance of trust, which makes sense in a way. They immediately pull out plans that would have no chance of being implemented in normal times. Interior ministers typically keep such a wish list of surveillance methods: zombie measures that rise undead again and again, such as data retention, real-name requirements in social networks, or face recognition in public spaces. The “approach of solving social problems with surveillance” is considered one of the 12 criteria of surveillance states* (*-marked links are in German).

But in such a situation, even a well-meaning, solution-oriented politician will make proposals that contain strong elements of control. Initiatives from the business community are warmly welcomed, as the pharmaceutical industry and the Internet giants also benefit from this advance of trust: they can act helpfully, but of course they will also pursue their own agendas. So if you take a closer look at what is coming up in the case of corona tracking, sobriety is the order of the day. Any trust in a moderate digital security policy has been systematically gambled away in the past. Measures once introduced in times of crisis* have been retained practically without exception.

The Asian role model?

The romance with the totalitarian state is reflected in the myth that China, with its surveillance and intervention capabilities, is better prepared to fight an epidemic. This is nonsense. South Korea and Taiwan have responded far better than China. As I recently described*, dictatorships typically follow the same pattern driven by vanity and chauvinism: first denial, then false assertions of being in control, then ruthless measures, and finally propagandistic containment. Nobody knows whether the current figures from China can be trusted. But one can assume that the Chinese government has the local media at least as much “under control” as the pandemic.

In China, Hong Kong, South Korea, Taiwan and Singapore, apps came into use that would not find the slightest approval in Europe. They report the locations of infected persons directly to the police. With the exception of South Korea, participation was not voluntary. Leaving quarantine triggered a warning and sometimes extreme penalties. The apparently broad acceptance of these measures in Asia may have many reasons: a sense of urgency, a different relationship to individuality, and trust in the state, but certainly not a lack of desire for civil rights.

Anyone who cites such a solution as a model (and assumes the broadest possible participation) builds on a credibility that has been systematically eroded in recent years. It is simply not enough to speak the words “in compliance with data protection” into a microphone. Even just talking about app development with an NSA-affiliated company like Palantir was an incredibly stupid mistake. Credibility, a precious commodity in times of crisis, is currently enjoyed by the EFF, the CCC, Reporters Without Borders, Privacy International, or perhaps data protection commissioners, but not by governments.

Criteria — and a solution

So what would be the criteria for a Corona app that complies with data protection regulations?

  • Data collection must be minimized: this is about contacts, not geotracking.
  • The data must remain on the mobile phones. There must be no automatic reporting to authorities.
  • There must be no geo-tracking of positive cases.
  • The software should be open source; in no case should it be a black box with possible backdoors.
  • It must not be possible for arbitrary users to trigger an alarm, which would invite trolling and false reports.
  • The data must be deleted after a reasonable time (the incubation period).
  • And, of course, there must be no obligation to install the app or to carry a mobile phone.

Similar criteria have been published by the CCC here*.

Most of these criteria seem to be met by the current European DP3T approach, which is being developed by a donation-funded NGO to provide an interoperable platform for different designs. If it is adopted widely enough (roughly 30% of the population or more), it can help ease the current shutdown and continue it only in a more targeted manner. The protective effect is thus a social one: since everyone has an interest in easing the situation, installing a well-designed app could become just as popular as the #StayAtHome campaign, which made a curfew unnecessary in many places. So we are dealing with a voluntary counter-model to the authoritarian state.
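The decentralized idea behind DP3T can be sketched roughly as follows. This is a simplification under assumed names, not the actual DP3T specification (which uses a carefully specified cryptographic construction): each phone derives its rotating ephemeral IDs from a secret daily key; if a user tests positive, only their daily keys are published; every other phone then expands those keys and checks, entirely locally, whether any derived ID appears in its own contact log.

```python
import hashlib

IDS_PER_DAY = 96  # e.g. one ephemeral ID per 15-minute slot


def derive_ephemeral_ids(day_key: bytes) -> list:
    """Derive the day's ephemeral IDs from a secret daily key.
    (Illustrative: real DP3T specifies a PRF/PRG construction.)"""
    return [hashlib.sha256(day_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(IDS_PER_DAY)]


def local_exposure_check(my_contact_log: set,
                         published_day_keys: list) -> bool:
    """Runs entirely on the phone: expand the published keys of infected
    users and test them against the locally stored contact IDs.
    The contact log itself never leaves the device."""
    for key in published_day_keys:
        for eph_id in derive_ephemeral_ids(key):
            if eph_id in my_contact_log:
                return True
    return False
```

The server only ever sees the keys of users who voluntarily report a positive test; who was near whom is computed on each handset, which is what keeps the criteria above intact.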

The discussion could accordingly become an asset: under urgent conditions, it quickly produced an awareness and clear criteria for operating in crises in Europe. This does not require media brainwashing, as conspiracy theorists claim, but transparency and a restraint of security-policy and economic reflexes, in favor of the common good.

All this says nothing about the effectiveness of the apps. They will make only a relative contribution, just as wearing masks or keeping one’s distance do. But we have seen: it is not a question of absolute control or of the one solution, but of dealing with the pandemic in an appropriate way that takes various aspects of the common good into account. For this relatively higher level of health security, people are then prepared to accept a relatively lower level of data security (e.g. due to Bluetooth and a limited level of trust) for a while.


Note from Apr 19, 2020:
At the time of publication of this article, I mentioned PEPP-PT instead of DP3T as the promising approach: a platform development which originally included DP3T, but which has since backed away from a consistently privacy-preserving implementation. You can read more about the background and technical details here. One can, at the moment, only speculate about the motives behind this decision.