Good news! Since the end of May, Facebook offers a data processing agreement for its advertising service Custom Audiences! But does this make the service compliant with privacy law, and may organisations now use it without the consent of their customers and visitors? Unfortunately, the answer is not yet clear.
Through Facebook Custom Audiences, organisations can show ads on Facebook to their own customers or visitors. They upload data about these customers or visitors to Facebook, such as their name, address, date of birth, telephone number, email address, or a unique identifier from a Facebook cookie. The data are uploaded in a scrambled format (hashed). On its servers, Facebook matches these hashes against hashes of its own users' data and finds the corresponding Facebook accounts. Thus Facebook can show the ad on the personal profile of the advertising organisation's customer or visitor. If you thought hashing was a form of anonymisation, please think again. As implemented by Facebook, hashing is merely a security measure during the transfer of the personal data.
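The matching step can be sketched in a few lines of Python. This is a simplified illustration, assuming (as match-based ad services commonly document) that identifiers are trimmed, lowercased, and hashed with SHA-256 before upload; the email address and function name are hypothetical, not Facebook's actual API. The point is that both sides can independently compute the same hash, so accounts remain linkable even though the raw data never travel in the clear.

```python
import hashlib

def hash_identifier(value: str) -> str:
    """Normalise an identifier (trim, lowercase) and return its
    SHA-256 hex digest, as match-based ad uploads typically require."""
    normalised = value.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# The advertiser hashes its customer's email before upload...
advertiser_side = hash_identifier(" Jane.Doe@example.com ")

# ...and the platform hashes the email it already holds for the account.
platform_side = hash_identifier("jane.doe@example.com")

# The digests are identical, so the customer is re-identified:
print(advertiser_side == platform_side)  # True
```

Because the same input always yields the same digest, hashing here protects the data in transit but does not anonymise anyone: whoever holds the original identifiers can reproduce the hash and link it back to a person.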
In November last year, the Dutch Consumer Union raised the alarm about Custom Audiences (in Dutch only). According to the Consumer Union, organisations were not allowed to use this service without the prior consent of customers/visitors. The Union investigated 17 Dutch companies and found that none of them had asked for separate consent. The Dutch Data Protection Authority confirmed that this was not allowed at the time (in Dutch only). The advertisers could not rely on the legal ground of legitimate interest, and Facebook did not offer a data processing agreement that excluded processing for Facebook's own purposes.
In its regular advertising process, Facebook uses every advertiser decision, whether to include or exclude people from a target audience, to enrich the corresponding user profiles. Facebook explicitly offers organisations the choice to exclude specific audiences. This use of the data by Facebook may lead to stigmatisation. This can happen, for example, if Facebook profiles a person as "interested in" a sensitive disease because that person, as an existing customer of a charity, was shown a relatively innocent ad. It may also lead to disadvantaging, if other organisations exclude this person from certain offers, and to unfair influencing, if certain political messages are withheld from that person. Many of these risks can be prevented if Facebook only acts as a data processor for Custom Audiences. Of course individual advertisers may also cause unfair disadvantages, undue influence or discriminatory exclusion of people, but at least these negative consequences would not be amplified across Facebook to all other advertisers.
Please note: for other advertising possibilities, such as advertising to comparable (lookalike) audiences, and for regular audience selection, Facebook can never act as a data processor. With regard to these regular types of advertising, Facebook itself determines the purposes and the means of the processing, and is therefore a data controller.
So, if they accept the new data processing agreement, may organisations rely on the legal ground of their legitimate interest to show targeted ads on Facebook to their customers and website visitors? The answer, unfortunately, is not black and white, not a simple yes or no.
The legal ground of legitimate interest is one of the six legal grounds that allow organisations to process personal data. This legal ground (Art. 6(1)(f) of the GDPR) consists of three parts: the organisation must pursue a legitimate interest, the processing must be necessary for that interest, and that interest must not be overridden by the interests or fundamental rights and freedoms of the people involved.
The use of personal data to send ads, also known as direct marketing, has explicitly been acknowledged as a legitimate interest since the introduction of the previous European data protection directive. It is essential in a free market economy that organisations may tell both their existing customers and potential new customers about their competitive services. Otherwise they would never be able to draw people to their new products, and people would not be able to switch to a more customer-friendly organisation. In the GDPR, the legitimate interest in performing direct marketing is mentioned in a separate recital, number 47.
In order to rely on the legal ground of legitimate interest, however, having a legitimate interest is not enough. The company must also satisfy the other two elements of the balancing test. This can only be assessed by taking all the circumstances of the case into consideration, such as the nature of the personal data, the frequency of the marketing, the age of the recipients, the understandability of the information provided to the recipients about the marketing, and the availability of an easy opt-out. To mention just one example: you are unlikely to be able to rely on your legitimate interest if you are a manufacturer of sanitary pads and you send targeted postal ads to 11-year-old girls. This actually happened in the Netherlands! (Story in Dutch only.)
Organisations that wish to engage in direct marketing must take into account the different rules for the different channels, contained in the ePrivacy Directive and the GDPR. In sum, they were and are allowed under the GDPR and the (Dutch implementation of the) ePrivacy rules to send unsolicited postal ads and make unsolicited telemarketing calls, as long as the annoyance is limited and the infringement of the rights and interests of the recipients is minor. But if organisations want to use identifiers from Facebook's tracking cookies or pixels, they must first ask the potential customers or visitors for consent to read these data. At that moment, they must also separately ask for consent to use these data to show targeted ads on Facebook.
At face value, it is possible to consider the Custom Audiences service on Facebook as the digital successor of snail mail advertisements. But that would only apply if Facebook, like the mailman, did not scan the contents of the message, and did not keep a record of the kinds of mail, their frequency and their senders. That could be the case if Facebook only acted as a data processor. But it is not clear whether this is the case. The new data processing agreement explicitly mentions a number of activities that Facebook will not undertake, but that is not the same as an exhaustive list of purposes defined by the data controller. On top of that, organisations should be able to add questions to the annual audit that Facebook promises to organise. One of the questions that needs to be answered is: "Can it be reasonably excluded that Facebook uses the data for its own purposes, such as improvement of its services, including the testing of new services, and for research purposes?"
In order to further ensure that the legitimate interest of the organisation outweighs the rights and interests of the recipients, organisations must also actively inform their existing customers about their intention to use their data to show targeted ads on Facebook. They must give users the opportunity to easily object to this use of their data. On top of that, organisations should prevent recipients from feeling ashamed or spied upon because of the nature of the advertised products or services.
But it is also possible that the Dutch data protection authority sticks to its initial judgement that the use of Custom Audiences is only allowed based on prior consent, regardless of whether the ads are targeted at new or existing customers and regardless of the technical way the data were collected. That point of view has already been voiced by the regional Bavarian data protection authority for the private sector (in German only). It remains to be seen whether the other supervisory authorities will agree. This will undoubtedly lead to a breathtaking shoot-out at the O.K. Corral (the Board, formerly known as the Article 29 Working Party).