To start with, a story and a quiz.

We all know this kind of story: for several years, before every taxi order, a friendly voice asked me to use a taxi app instead. When even a taxi driver handed me a recommendation card, I decided to give it a try.

The result was stunning. Okay, I had to enter my data first. But after just one minute of the most pleasant user experience, I had not only ordered a taxi – I could also see who the driver was and watch in real time where he was approaching my location from. He knew who I was, where I wanted to go, and that he should please wait by the road and neither ring the bell nor drive away. Needless to say, you can also pay cashless with the app, including a tip if you wish. A killer app. I will never have to listen to the hold music of the taxi hotline again … What could possibly be the catch?

The catch in the story

– the Netflix variant, so to speak – looks like this: XY sits in a prison cell with abrasions on his face and hears the taxi driver being harassed next door. Why? Because his democratically elected president persecutes the followers of his former ally, a certain Mr Gülen, with religious zeal. Because XY had the misfortune to live in Turkey and, stupidly enough, to fall under suspicion of being a particular supporter of Mr Gülen while on his way to an important appointment. And taxi drivers are often interesting conversation partners.

You may replace “Turkey” with China, Russia or the USA, and “Gülen” with the Dalai Lama, Pussy Riot or Edward Snowden at will. Sounds unrealistic? You know, as a young man I was once myself suspected of supporting terrorists – which was not the case. Although in 1985 this had nothing to do with data, it was – even under the orderly democratic conditions of the Federal Republic of Germany – a rather striking experience.

It’s true, the violation in this story lies not in the collection of data. It lies in the wrongful detention and the use of violence. And why should you worry if you have nothing to hide? Well, some people do have things to hide for good reasons, reasons you would support. And at the very least, you too have good reasons not to share your data with people you don’t know.

This outcome is, I hope, an unrealistic dystopia. Because fortunately there is data protection. Fortunately, I can assume that the data from my apps – from all apps of all people – will not be analysed in one place by AI and combined into a transparent model of all our activities and interests. That nobody will ever have the power, and the immense potential for blackmail, to know everything about everyone. Or will they?

Now here is the quiz:

“We kill people based on metadata.”

– who said it?
A: Muammar al-Gaddafi
B: Xi Jinping
C: Michael Hayden

It was of course the latter, Michael Hayden, director of the NSA until 2005, in a remarkably frank statement. In China such a statement would be just as true, but far less likely to be made in public. And the Arab despots of Gaddafi’s era already had shredders for corpses, but not yet the sophisticated surveillance tools they have bought from European and US companies only in recent years.

I admit, this all sounds drastic. What I want to do here is take up the cudgels for the very fundamental right to determine how our data – more precisely: the data we generate – are to be handled. For mechanisms that protect us from a full take we never knowingly agreed to. Because neither hackers nor IT companies nor states always have our best interests at heart. And societies also live from opposition if they want to endure in the long run.

Human rights don’t change everything.

This is just one example of fundamental digital rights, one that is particularly important to me. Others deal with the exclusion of people who do not have access to IT, with the security of democratic elections, etc.

If we start to discuss them and one day decide on them, we set standards by which states and governments must be measured in the future. That is why this topic is so important to me, and that is why I am writing this blog to promote it.
