In mid-February 2021, Brussels took a step forward in the process to modernise the existing rules on cookies, spam and the confidentiality of digital communications. After more than two and a half years of haggling, the representatives of the EU Member States reached a compromise on the new ePrivacy Regulation. But this does not yet mean that there is a new law in Europe. The telecom ministers (in the Netherlands, the Minister of Economic Affairs) have very different ideas from the European Parliament and the European Commission, which published the ePrivacy proposal in January 2017. The three institutions are now entering the so-called trilogue, to produce a new text on which all three agree. These will be difficult discussions. The positions of the Council and Parliament could hardly be further apart.
The Council is almost back to where it started in 2019: diametrically opposed to the European Parliament. The five biggest disagreements are: (i) the possibility of further processing of traffic and location data, (ii) re-use of all data that can be collected from end-users' devices, (iii) use of a tracking wall to enforce consent, (iv) privacy by default and (v) (re)introduction of a general traffic data retention obligation.
The Council wants to allow further processing of traffic and location data (the communications metadata) in Article 6c. This means that the communication provider, including a Big Tech provider from Silicon Valley, may decide for itself for which other, compatible purposes it may use the data. Usually, the original collection purposes are rather limited: to make the communication technically possible, to provide a well-functioning, secure service, and to send bills based on consumption.
The Council proposes that communication providers may use the metadata, for example, to map traffic flows, create heatmaps and statistics, or for (commercial) research. All this is allowed without consent, as long as the providers anonymise the data or, if that is not possible, pseudonymise them. Anonymisation means irreversibly deleting all identifiable data, while pseudonymisation means replacing directly identifiable data, such as names, with numbers. There is a huge difference in privacy impact between the two, but in practice this leads to endless discussions. As long as the telecom provider still has the original identifying data, or can link the data to identifying data, there can be no question of anonymous datasets that the provider can provide to other parties. On top of that, with new communication techniques like 5G and 6G, telecom providers will collect much more detailed location data than they do now, which makes reliable anonymisation even more difficult.
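The practical difference the Council's safeguards hinge on can be sketched in a few lines of Python. This is a toy illustration with invented records, not anything taken from the Regulation: the key point is that pseudonymisation keeps a lookup table that makes re-identification trivial for whoever holds it, while anonymisation discards the identifier entirely.

```python
# Toy metadata records (invented for illustration only).
records = [
    {"name": "Alice", "cell_tower": "BRU-031", "time": "2021-02-10T08:15"},
    {"name": "Bob",   "cell_tower": "BRU-017", "time": "2021-02-10T08:16"},
]

def pseudonymise(rows):
    """Replace names with numbers, but keep the lookup table.
    As long as this table exists, the holder can re-identify
    everyone, so the output is NOT anonymous."""
    table = {}
    out = []
    for row in rows:
        pseudo = table.setdefault(row["name"], len(table) + 1)
        out.append({**row, "name": pseudo})
    return out, table

def anonymise(rows):
    """Irreversibly drop the identifying field. No table is kept,
    so the output alone cannot be traced back to a person."""
    return [{k: v for k, v in row.items() if k != "name"} for row in rows]

pseudo_rows, lookup = pseudonymise(records)
anon_rows = anonymise(records)

# Whoever holds the lookup table can still reverse the pseudonyms...
reverse = {number: name for name, number in lookup.items()}
assert reverse[pseudo_rows[0]["name"]] == "Alice"
# ...while the anonymised rows no longer carry a name at all.
assert all("name" not in row for row in anon_rows)
```

The sketch also shows why the distinction blurs in practice: the pseudonymised dataset looks just as "nameless" as the anonymised one, and everything depends on whether the provider still holds (or can recreate) the lookup table.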
Other safeguards built in by the Council are that providers must ensure that the data cannot be used to create individual profiles, that users are informed, and that users can object. Municipalities can rely on this exception to the confidentiality of communications if they want information from telecom providers about dense traffic flows. This information can be used, for example, to decide where to build new cycle paths.
Although it seems that the privacy impact has been well thought through, there are still many snags, also from an ethical point of view. If the police want to enforce a curfew, or track down illegal parties, they can use the anonymous data to send teams to places where too many people gather at specific times. Similarly, TomTom sold anonymous data to the police about the times and stretches of road where people were driving too fast, so that the police could install speed cameras there. In such cases, even data that are genuinely anonymous have a major impact on individual people.
Via an expansion of the cookie provisions in Article 8, the Council wants to allow companies to significantly infringe on individual privacy rights. The Council wants (via Article 8, paragraph 1(g)) to provide a right to further processing of all data that are processed or stored in a device: "The processing and storage capabilities of terminal equipment and the collection of information from end-users' terminal equipment, including about its software and hardware."
That includes cookies and pixels, but also stored photos, passwords and contact lists, and live access to the camera, location data and microphone. In view of those contents, safeguards such as encryption and pseudonymisation immediately lose all meaning. After all, I don't know what to make of pseudonymising my passwords or the photos of my dear son.
The Council has been strongly influenced by the media industry, particularly by newspapers and magazines. They have convinced the Council that the use of tracking cookies is strictly necessary for their survival and that the media therefore should not have to ask for consent. So the Council writes in recital 21aa: "In some cases the use of processing and storage capabilities of terminal equipment and the collection of information from end-users' terminal equipment may also be necessary for providing a service, requested by the end-user, such as services provided in accordance with the freedom of expression and information including for journalistic purposes, e.g. online newspaper or other press publications as defined in Article 2 (4) of Directive (EU) 2019/790, that is wholly or mainly financed by advertising provided that, in addition, the end-user has been provided with clear, precise and user-friendly information about the purposes of cookies or similar techniques and has accepted such use." Note that the word 'accepted' does not equate to 'consent': if you use the latter term, you have to comply with all the requirements of the GDPR. In one fell swoop, the Council also wants to adopt a provision that tracking walls are allowed, provided there is an equivalent offer from the same provider that does not require consent for tracking cookies.
The European Parliament had tightened the European Commission's proposal on privacy by default, so that all kinds of software suppliers had to ensure that they only collected the minimum necessary data. The Council sees it differently. Browsers must enable users to give consent for many things at once (whitelisting), but not to refuse everything at once. Every website is allowed to ask a visitor for consent over and over again, because the Council says it is very important for self-determination that end users can decide for themselves about every consent request, instead of outsourcing such decisions to their software.
Bulk data retention of traffic data is a very sore point. For more than twenty years, law enforcement has been trying to introduce a bulk data retention obligation. This means that they want to store the metadata about calls and e-mails, and preferably also the location data, of all citizens for an as yet undefined future (law enforcement) purpose. Of course I am not opposed to targeted investigations using the available digital resources. But a bulk retention obligation turns innocent people into permanent beacons, because you never know. Nor is bulk data retention necessary if law enforcement makes better use of its existing powers, such as freezing the data of people who may be involved in a crime. Precisely because of the major privacy risks and the lack of necessity, the highest court in the EU, the Court of Justice of the European Union, declared the previous European data retention obligation invalid in 2014. The Court repeated its reasoning in connection with national data retention obligations in England, France and Belgium in 2020. The Court explains clearly that the fundamental right to a private life, without fear of being constantly surveilled, may not be set aside completely. Such interference is only permitted on an individual basis, if it is really necessary, for instance if there is a suspicion of involvement in a crime.
When I still worked as a spokesperson for the internet provider XS4ALL, I campaigned under the heading of mandatory objection against traffic data retention (the word pun only works in Dutch). Later on, I organised a petition against the data retention obligation in Europe together with other civil rights organisations, and later still, working for the Dutch Data Protection Authority, I (co-)authored legislative advice, reactions and opinions at the national and European level in 2016 and 2017.
Fortunately, more people are upset about this attempt by the Council, at the insistence of France in particular, to include this Trojan horse in the ePrivacy Regulation. I quote the head of the German Federal Data Protection Authority, Professor Ulrich Kelber. He writes that the proposal crosses several red lines and names the data retention obligation as the main bone of contention, "which has already failed in so many courts." He concludes: "Wenn die ePrivacy-Verordnung so bleibt, wie der Rat der EU sie heute beschlossen hat, wäre das ein schwerer Schlag für den Datenschutz." (If the ePrivacy Regulation remains as the Council of the EU has decided today, it would be a serious blow to data protection.)
We have had ePrivacy rules for a long time already. In the Netherlands they are implemented in Chapter 11 of the Telecommunications Act. One of the reasons to draft the new ePrivacy Regulation was that, until recently, the scope of the rules was too limited: the rules protecting the confidentiality of digital communications only applied to a handful of providers of classic telephony and internet access services. Only the rules on cookies and spam applied to everyone.
Providers of internet communication services, such as WhatsApp, Signal, Gmail, Facebook, LinkedIn, Zoom and Teams, were not covered by the ePrivacy rules until now. Since the end of last year, the ePrivacy rules do apply to these kinds of providers of over-the-top services. On 20 December 2020, the European Electronic Communications Code (EECC) directive entered into force, with a much broader definition of electronic communications services.
In sum, for the time being, no new ePrivacy Regulation will come into force. But perhaps, on reflection, the combination of the GDPR with the (extended) ePrivacy rules will suffice after all. The Advocate General of the European Court of Justice recently advised in a case of the Belgian DPA against Facebook that the national supervisory authorities responsible for enforcement of the GDPR may enforce ePrivacy rules nationally. Perhaps the European Commission and the European Parliament should therefore leave well enough alone and only work on expanding and strengthening supervisory capacities.