Human rights impact assessment of Facebook Pages

November 17, 2022

If you want to know how a service affects human rights, you have to perform an 'HRIA', a Human Rights Impact Assessment. The Dutch central government has already developed its own model to analyse the human rights impact of an algorithm, the IAMA model. Using this model, Privacy Company performed such an analysis for the Ministry of the Interior and Kingdom Relations (BZK) on the government's use of Facebook Pages. The HRIA is published at tweedekamer.nl. Privacy Company also conducted a 'DPIA', a Data Protection Impact Assessment, on the use of Facebook Pages.

The HRIA looks at four types of human rights:  

  1. person-related human rights, such as the right to personal identity, autonomy and the protection of one's honour and good name,  
  2. freedoms, such as freedom of expression, freedom of religion and freedom of protest,  
  3. equality rights, such as the right not to be discriminated against or profiled, and  
  4. procedural rights, such as the right to, and access to, a fair trial.  

One problem with this HRIA was that Facebook did not provide insight into the algorithms it deploys or into the data it uses in the process. Privacy Company therefore had to observe from the outside, which often made it impossible to make a firm statement about the impact. Privacy Company was, however, able to show that there is a potential impact, and described ways to measure that impact in collaboration with Facebook.

The HRIA identifies four ways in which government use of Facebook Pages can impact human rights:  

  1. Facebook can, through unbalanced representation of minority groups and of opinions, negatively impact the right to identity and autonomy.  
  2. Facebook can, by favouring certain opinions over others, negatively impact freedom of information and freedom of expression, and can have a discriminatory effect.  
  3. Governments from outside Europe can use Facebook to gather information on, for example, dissidents who have fled to Europe, and to take action against them outside the European legal order.
  4. The Dutch government has no way of explaining how Facebook's actions came about, nor of correcting any discriminatory actions by Facebook.  

The investigation found that Facebook does favour certain opinions over others. Facebook also does not explain its algorithms and offers the government insufficient options to correct discrimination. The remaining two ways in which Facebook can impact human rights, Privacy Company could neither rule out nor prove: Facebook is not transparent enough for that. All in all, this means there is a high risk of human rights violations if the Dutch government uses Facebook Pages.  

Privacy Company has extensive experience in investigating the data protection risks of services through DPIAs. This is the first time Privacy Company has performed a thorough analysis of the human rights impact of a service. The national IAMA model was a good starting point, but proved incomplete in practice when applied to Facebook's highly complex use of algorithms. Privacy Company has therefore extended the model with an analytical framework covering the multiple ways in which a service can impact human rights. An HRIA thus provides good insight into how to deploy a service responsibly.  

We are publishing this blog post about our findings with the permission of the Ministry of the Interior and Kingdom Relations. For questions about the analysis, please contact the ministry's press spokesperson, Thomas van Oortmerssen, at 0031 6 31 01 97 81.  

If you would like to know more about performing HRIAs or about the IAMA model, please contact Winfried Tilanus.
