Interview: Privacy, data protection, and the digital single market

EU data protection supervisor Wojciech Wiewiórowski. Source: YouTube

EU data protection supervisor Wojciech Wiewiórowski thinks that we should approach the issue of the digital single market and a European digital society neither rashly nor timidly: without being afraid of data, but with enough restraint to keep the very real consequences in mind.

One of the priorities of the Estonian EU council presidency is the free movement of data as the so-called Fifth Freedom after the free movement of goods, capital, services, and persons. A conference in Tallinn on Monday and Tuesday discusses this in the context of the introduction of the Digital Single Market.

ERR News had the opportunity to talk to European Data Protection Supervisor Wojciech Wiewiórowski, who has first-hand experience with the difficulties of reconciling the call for a digital single market with the ever greater issue of data protection and privacy.

ERR News: Mr. Wiewiórowski, in terms of data protection, what are the necessary minimal guarantees for the digital single market?

Wiewiórowski: The new data protection framework that has been established over the last few years by the EU legislators tries to set a kind of gold standard of data protection at the time the digital single market is introduced. It’s hard to say what the minimal standard should be; it’s better to ask what the optimal standard should be.

The optimal situation is one where the real controller of the data about a person is that person themselves, while at the same time data that isn’t associated with any person is able to travel across borders without limits. Of course it’s hard to distinguish between the two: most machine-generated data is understood to be free of personal connections.

But let’s take the connected car, for example. In theory we assess the machine, the car, but at the same time we understand that when you put a black box into a car, we will be able to assess the driver and probably the passenger as well. For example, we’ll be able to assess their behavior for insurance purposes, or for liability, or even criminal responsibility. There’s nothing wrong with that, it’s something that can be done, but it shows that the data is in fact personal once the context is given.

Are there any specific legal or practical obstacles that need to be overcome?

I would look at this from the point of view of interoperability. You have different levels of bottlenecks, different levels of obstacles. The first are technical, which means that data can flow across borders only if there’s an infrastructure for that, and if this infrastructure is accessible. Then there’s the organizational part, which means that we have to know where the data is, who is controlling it, and how to deal with that. Then we also have a semantic level, which means we have to understand what the data is, understand the languages, which is a problem of the digital market, but also the different formats that are used for different kinds of data.

Beyond that there are two more layers we have to think about. One is the political one, and I guess this has been overcome with the broad political agreement in Europe that this fifth freedom is needed in the European Union. And then also the legal one, which consists of matters of cyber security and also the fight against cyber crimes.

But of course, as the data protection commissioner, I have to say that the rules dealing with privacy online, meaning not only data protection, but also the protection of the confidentiality of information and of the communication between people, are among the things that need to be taken into consideration.

How much of all this is addressed in the EU’s rules announced for May 2018?

Well, I’d love to say we’ve solved all the problems, and that from May 25 on everything will be going perfectly well, but this would be possible only if the world didn’t develop any further. And of course it is developing. It’s about keeping up, it’s about being ready to observe whether something we’re preparing right now is efficient, and whether it offers security for the fundamental rights we want to secure.

How are different attitudes to data security and data protection a problem? Estonia is very open, but this changes if we move toward, for example, Germany.

Oh, I would say that there are definitely cultural differences! A very good example, I think, is the situation concerning tax data. Tax data is very easily accessible in the Scandinavian countries: you can check what the income of your neighbor was and the tax they paid last year.

At the same time, this would be heresy in many countries of the European Union. The same goes for the personal identification number, which exists in many countries and allows you to identify a person across different databases; Belgium has it, and my home country, Poland, has it as well.

But that’s heresy in Germany, where there’s still the memory of people being tattooed with identification numbers in Nazi times. So there are differences. And even if you think about Estonia, of course it’s very open, but that doesn’t mean that Estonian citizens want all the data about them to be accessible to anyone.

This isn’t the Brave New World of Aldous Huxley, where everybody belongs to everybody. You have different attitudes about which data you give to your government and which data you give to another, even if it’s that of a neighboring country. And then you have different situations, for example when you deal with medical data, or financial data, or even data about the education of your children, as compared to data about your shopping.

So, do we dive in, or should we be very cautious?

I would take the motto of my home town, Gdańsk: Nec Temere, Nec Timide, neither rashly nor timidly. We shouldn’t be afraid of data, and we shouldn’t be afraid of the digital single market, also as far as data protection is concerned; but at the same time we shouldn’t be the type of enthusiasts who seize every possibility that exists and try to use it without thinking about the real consequences.

Editor: Dario Cavegn