New DPIA for the Dutch government and universities on Microsoft Teams, OneDrive and SharePoint Online

February 21, 2022

Commissioned by the Dutch Ministry of Justice and Security and SURF, the ICT procurement organisation for universities, Privacy Company conducted a new investigation into the privacy risks of Microsoft Teams, OneDrive and SharePoint. The outcome is that Microsoft has taken measures to remedy the six previously identified high risks, but organisations should not use these cloud services to exchange or store sensitive and special categories of personal data. They may do so only if they can encrypt the content with encryption keys under their own control. This measure is necessary because of the high risk of possible access to those data from the United States. This risk remains even after 2022, when Microsoft will process almost all personal data of its European Enterprise and Education customers exclusively in European data centres.

We publish this blog with permission from the Ministry and SURF. You can read the complete DPIA (in English) here. Questions about the DPIA can be directed to SLM Rijk (Strategic Supplier Management Microsoft, Google and AWS Rijk), which can be reached via the Ministry of Justice press office, 00 31 70 370 73 45.


Teams, OneDrive and SharePoint

Microsoft Teams allows people to make video calls and share information in permanent chat channels with people inside and outside their organisation. The OneDrive and SharePoint services are used for storing and sharing files in the Microsoft cloud. Teams, OneDrive and SharePoint are cloud services; this DPIA does not address local, on-premises use of OneDrive and SharePoint.

To use the online services, users can install software on their own devices or log in via a browser. This DPIA examines the data traffic generated by the three services, both from applications installed on macOS, Windows, iOS and Android and from use through a Chrome browser. Users logged in through Microsoft Azure Active Directory (a kind of online phone book containing the login names and passwords of all users).


Risks of transfer to the United States

The new DPIA is mainly about the risks of collecting and processing so-called Diagnostic Data, that is, data about the individual use of the services. For example: how often you call whom via Teams, what kind of pictures you add in a chat or on an intranet page, and what kind of documents you write, read and share. Additionally, the report addresses the privacy risks of using Microsoft's cloud for the Content Data you can share through these services.

Diagnostic Data

Microsoft collects Diagnostic Data in several technical ways: through system-generated server logs on its own cloud servers, and through the so-called telemetry client built into the Teams, OneDrive and SharePoint software. That client is programmed to systematically collect telemetry data on the end user's device (or, when using Office for the Web, in the browser) and send it periodically to Microsoft's servers in the US. The Diagnostic Data are different from, and technically distinguishable from, the functional data that Microsoft must process (temporarily) to enable users to use its online services over the Internet.
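To make the mechanism concrete, below is a purely illustrative sketch of how such a telemetry client typically works: usage events are buffered on the device and periodically uploaded in batches to a collection endpoint. This is not Microsoft's implementation; the endpoint URL, event names and fields are hypothetical.

```python
# Illustrative sketch of a batching telemetry client: events are buffered
# locally and flushed periodically to a collection endpoint. This is NOT
# Microsoft's implementation; the endpoint URL and event fields are
# hypothetical.
import json
import threading
import time
import urllib.request

COLLECTION_ENDPOINT = "https://telemetry.example.com/collect"  # hypothetical
FLUSH_INTERVAL_SECONDS = 60

class TelemetryClient:
    def __init__(self):
        self._buffer = []
        self._lock = threading.Lock()

    def record_event(self, name: str, properties: dict) -> None:
        """Buffer a usage event recorded on the device (e.g. 'call_started')."""
        with self._lock:
            self._buffer.append({"name": name, "time": time.time(), "props": properties})

    def flush(self) -> None:
        """Send all buffered events to the collection endpoint in one batch."""
        with self._lock:
            batch, self._buffer = self._buffer, []
        if not batch:
            return
        request = urllib.request.Request(
            COLLECTION_ENDPOINT,
            data=json.dumps(batch).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(request)  # periodic upload of diagnostic data

    def run_forever(self) -> None:
        """Flush the buffer on a fixed interval."""
        while True:
            time.sleep(FLUSH_INTERVAL_SECONDS)
            self.flush()
```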


Special privacy terms for Dutch government and universities

SLM Rijk concluded new privacy terms with Microsoft in early May 2019 for the 300,000 digital workstations of the central government. In late 2019, SURF concluded the same terms for the Dutch universities. The new terms apply to the Enterprise and Education versions of the Office software in use by the ministries, the tax authorities, the police, the judiciary and affiliated independent administrative bodies. Since then, Microsoft has acted only as a data processor for all of its online services. Microsoft may only process the personal data for three purposes, and is prohibited from processing any personal data for profiling, data analytics, market research or advertising. Microsoft has provided effective audit rights to the Dutch government. Privacy Company has performed repeat inspections since 2019 to verify that Microsoft is living up to its commitments and has taken the promised improvement measures.


Six low privacy risks for diagnostic personal data

The result of this DPIA, after repeated consultations with Microsoft, is that there are no longer any known high risks resulting from the processing of the diagnostic data, provided the system administrators of the organisations follow the previous recommendations about using Office 365. However, there is a high risk if organisations use Teams to exchange highly sensitive and special categories of data, due to potential access by law enforcement and security agencies in the US. The six low risks are:

  1. Loss of control and re-identification of pseudonymous personal data due to the structural transfer of telemetry data to the US. The study shows that this traffic flow is limited in scope, and contains only pseudonymous personal data (with one exception, see risk #2). Starting in 2023, organisations can choose to have all Diagnostic Data, Support tickets, and Account Data processed exclusively in the EU. Even after 2022, Microsoft will still transfer some personal data to the US to detect and resolve security incidents, but these transfers will be incidental, not structural, and will generally involve only pseudonymised and aggregated data.
  2. Loss of control and improper further processing due to the occasional processing of pathnames, usernames and email addresses in specific telemetry messages about OneDrive. Microsoft explained that this is the only exception to the rule that telemetry messages contain only pseudonymous data, and why it may be necessary in OneDrive, for example when multiple users access the same document at the same time. Microsoft also explained that access to these OneDrive Diagnostic Data is controlled and limited to the just-in-time security group, and that the data are collected through a separate endpoint and never kept for more than 30 days. System administrators themselves can also implement risk mitigation measures, such as using pseudonyms in the Azure AD and establishing policies that employees should not use personal data in file and path names (a sketch of such a pseudonymisation measure follows this list).
  3. Loss of control and lack of transparency on browser Telemetry Data, and the use of so-called Connected Experiences. Microsoft calls this category of Telemetry Data Required Service Data. Even if an organisation has centrally minimised the collection of telemetry, by selecting the 'Neither' level, Microsoft collects the same amount of Required Service Data. According to Microsoft, these data are too dynamic or too business-confidential in nature to publish in detail, but it does provide access (when available) through Data Subject Access Requests (DSARs). Microsoft reiterated that it only processes these data for the three authorised processing purposes.
  4. Limited data subject access rights. The DPIA shows that Microsoft’s access tool for administrators to the individual Diagnostic Data produces results that are difficult to understand. In response to this finding, Microsoft has pledged to better assist administrators with access requests, even if that requires explaining why some Required Service Data are not personal data or are no longer available because they were immediately stripped of all identifiers.
  5. Unauthorised further processing by third parties. The repeat inspections by Privacy Company revealed that Microsoft was passing traffic to a number of third parties through built-in services in Teams, OneDrive and SharePoint. Microsoft has largely remedied this risk by either entering into a subprocessor agreement or by allowing system administrators to centrally block traffic (e.g., disable images from Giphy, including for guest users). This study found that there was still traffic to Microsoft's search engine Bing if an Enterprise or Education customer wanted to insert an image into a SharePoint page. Microsoft is an independent data controller, with its own commercial purposes, for data processing through Bing. Therefore, this traffic should not occur if administrators have centrally disabled these types of "Microsoft-as-controller" services (the Additional Optional Connected Experiences). Microsoft has committed to removing traffic to Bing from SharePoint by July 2022.
  6. Employee monitoring system: chilling effect. Microsoft offers two different analytics services for Teams: Teams Analytics & Reports and Viva Insights. The first tool, Teams Analytics & Reports, provides system administrators with detailed insights into individual work behaviours. System administrators at universities and government organisations can mitigate this risk by largely disabling this functionality; Microsoft is not willing to change the default setting. The other tool, Viva Insights, has a more privacy-friendly configuration and is off by default. It includes MyAnalytics and Workplace Analytics, services that provide employees with information about their productivity and offer managers insight into the work patterns of individual employees. If an administrator explicitly enables the service, the individual user still has the option to opt out.
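As an illustration of the pseudonymisation measure mentioned under risk #2 (and in the recommendations below), the following minimal sketch derives stable pseudonyms for directory display names with a keyed hash, so that telemetry messages and path names only ever carry the pseudonym rather than the real name. The secret key, function names and example values are assumptions for illustration; the DPIA does not prescribe a specific method.

```python
# Minimal sketch: derive stable pseudonyms for directory display names with
# an HMAC keyed by a secret that stays inside the organisation. The key,
# names and example values are illustrative assumptions, not part of the
# DPIA or of any Microsoft API.
import hashlib
import hmac

# In practice this key would come from the organisation's own key management.
ORG_SECRET_KEY = b"replace-with-a-locally-managed-secret"

def pseudonym(real_name: str) -> str:
    """Return a stable, non-reversible pseudonym for a user name."""
    digest = hmac.new(ORG_SECRET_KEY, real_name.encode("utf-8"), hashlib.sha256)
    return "user-" + digest.hexdigest()[:12]

# Provision the pseudonym, not the real name, as the display name in the
# directory, so telemetry and path names only ever see the pseudonym.
print(pseudonym("Jane Doe"))   # e.g. user-3f1a0c...
print(pseudonym("Jane Doe"))   # same input -> same pseudonym (stable)
```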
     

High risk for special categories of personal data not encrypted with a self-controlled key

Microsoft is a U.S. company, and according to the European Court of Justice (Schrems II), U.S. national security laws do not provide sufficient legal protection for Europeans if their personal data are intercepted or their disclosure is ordered. This risk occurs even when these data are processed and stored exclusively in the EU, as access to these data can be compelled through US legislation such as the US CLOUD Act. The fact that Microsoft applies encryption to all customer data in transit over the Internet, and to stored files, cannot eliminate this risk either: as long as Microsoft has access to the key, it can be compelled to decrypt and disclose the data, however theoretical that scenario may seem.


The most important mitigation measure for organisations in Europe against this risk of mass surveillance is to encrypt the data with a key under their own control, to which even a vendor like Microsoft has no access. Microsoft does offer end-to-end encryption (E2EE) for the storage of commonly used file formats in OneDrive and SharePoint, but not yet for calls in Teams with more than two people: E2EE is currently available only for unscheduled one-to-one calls.


Because organisations have no control over the encryption keys, they should not exchange sensitive or special categories of personal data through Teams, unless the data are inherently public (such as public lectures or some court cases). To protect sensitive and special categories of personal data in OneDrive and SharePoint, organisations can use Microsoft's Double Key Encryption (DKE). Privacy Company has written a public report on DKE (in Dutch only) at the request of the NBV of the AIVD.
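To illustrate the principle behind a self-controlled key (a conceptual sketch only, not Microsoft's actual DKE implementation): content is encrypted in two layers, so that both the cloud provider's key and a key that never leaves the organisation are needed to recover it. The key handling and function names below are assumptions for illustration; it requires the Python 'cryptography' package.

```python
# Conceptual sketch of the "double key" idea: content is encrypted in two
# layers, so that BOTH the cloud provider's key and a key held only by the
# organisation are needed to decrypt. This illustrates the principle only;
# it is not Microsoft's Double Key Encryption implementation.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

org_key = AESGCM.generate_key(bit_length=256)     # never leaves the organisation
cloud_key = AESGCM.generate_key(bit_length=256)   # held by the cloud provider

def double_encrypt(plaintext: bytes) -> tuple[bytes, bytes, bytes]:
    """Encrypt first with the organisation's key, then with the cloud key."""
    n1, n2 = os.urandom(12), os.urandom(12)
    inner = AESGCM(org_key).encrypt(n1, plaintext, None)
    outer = AESGCM(cloud_key).encrypt(n2, inner, None)
    return outer, n1, n2

def double_decrypt(outer: bytes, n1: bytes, n2: bytes) -> bytes:
    """Both keys are required: stripping one layer alone yields nothing readable."""
    inner = AESGCM(cloud_key).decrypt(n2, outer, None)
    return AESGCM(org_key).decrypt(n1, inner, None)

ciphertext, n1, n2 = double_encrypt(b"special categories of personal data")
assert double_decrypt(ciphertext, n1, n2) == b"special categories of personal data"
```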

Recommended actions for public sector organisations

  • Do not share highly sensitive or special categories of personal data through Teams unless it is a public meeting
  • Enable E2EE in Teams for 1-on-1 conversations, and enable E2EE for all meetings and chats as soon as Microsoft supports it
  • Establish policy rules for the sharing of personal data in Teams and OneDrive that all participants, including guest users, must accept
  • Use Microsoft Double Key Encryption for storing files with highly sensitive or special categories of personal data in OneDrive or SharePoint. This is not possible for recordings of Teams conversations, unless the recordings were made on an on-premises server. Use Customer Key and Customer Lockbox for other stored personal data
  • Consider using pseudonyms for employees whose identity is confidential, including when using Azure AD as Single Sign-On to other companies' services
  • Do not use SMS for authentication to prevent the transfer of unencrypted cell phone numbers. Instead, use the Authenticator app or a hardware token
  • Disable the Additional Optional Connected Experiences in Office 365
  • Prohibit access to third-party applications in the Teams app store
  • Set the telemetry collection in installed applications to the lowest level
  • Set the telemetry collection in Windows to the lowest 'security' level
  • For the next six months, until Microsoft removes the traffic to Bing from SharePoint, warn end users not to insert images into SharePoint via the built-in Bing search engine
  • Do not use the new Teams Analytics & Reports service, or at least opt for pseudonymous viewing
  • Establish policies to prevent Microsoft's analytics services from being used as employee monitoring systems
  • Disclose the organisation’s retention policies, enforce compliance and delete outdated data (to mitigate risks of access from the U.S.)
  • Establish policies to prevent file names and file paths from containing personal data
  • Inform employees about their data subject access options through the Data Viewer tool and by submitting a Data Subject Access Request to the organisation's administrator(s)
  • Use Microsoft's Data Viewer tool yourself to view the diagnostic data and compare the results with your own analysis of outbound network traffic from a test environment (a minimal sketch of such a comparison follows this list)
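As an example of the last recommendation, the following minimal sketch compares outbound hostnames captured in a test environment (for example, exported from a DNS or proxy log, one hostname per line) against patterns associated with diagnostic data collection. The file name and the patterns are illustrative assumptions, not an authoritative list of Microsoft endpoints.

```python
# Minimal sketch: flag outbound hostnames from a test environment that look
# like diagnostic-data collection endpoints. Input format (one hostname per
# line), file name and patterns are illustrative assumptions.
import sys

# Hostname fragments to flag; adjust to your own findings and Microsoft's
# current documentation - these are examples, not an authoritative list.
TELEMETRY_PATTERNS = [
    "events.data.microsoft.com",
    "telemetry",
]

def flag_telemetry_hosts(log_path: str) -> None:
    """Print every observed hostname that matches a telemetry pattern."""
    with open(log_path, encoding="utf-8") as f:
        hosts = {line.strip().lower() for line in f if line.strip()}
    for host in sorted(hosts):
        if any(pattern in host for pattern in TELEMETRY_PATTERNS):
            print(f"possible diagnostic-data endpoint: {host}")

if __name__ == "__main__":
    flag_telemetry_hosts(sys.argv[1] if len(sys.argv) > 1 else "outbound_hosts.txt")
```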

Conclusions

Since June 2019, as a result of its negotiations with SLM Rijk and SURF, Microsoft has taken many legal, technical, and organisational measures to mitigate the risks for data subjects due to the processing of personal data through the use of Teams, OneDrive, SharePoint, and the Azure AD. In response to the initial findings of this DPIA, Microsoft has committed to mitigating a number of shortcomings and has provided significant assurances about its data processing.


Given the risks of using U.S. cloud providers, Microsoft must make more adjustments and improvements to mitigate the remaining high risk and the six identified low risks. Microsoft must disclose when it will enable E2EE for all Teams exchanges. In addition, Microsoft must become more transparent about the content of Required Service Data and organise an audit of its compliance with the agreed purpose limitation and retention periods for those personal data. Finally, Microsoft must comply with the requirement that default settings be privacy friendly (data protection by default). That means new analytics services must be off by default, so that system administrators have to actively turn them on, based on clear information.


Caveat


It is uncertain how national data protection authorities will assess the transfer risks in their joint investigation into the use of cloud services by public sector organisations. The results are expected by the end of 2022. For this DPIA the transfer risks have been rigorously assessed, including in a separate DTIA (Data Transfer Impact Assessment). If necessary, the DPIA and DTIA will be updated in 2023.

If the EDPB (European Data Protection Board) were to assess the transfer risks as high, even after Microsoft completes its EU Data Boundary, organisations in the Netherlands would effectively no longer be able to use services from US providers, and the consequences would extend far beyond the use of these Microsoft services.

Read more

  • Read the complete DPIA (in English)
  • Read the DTIA (7 PDFs, in English)
  • Read the summary of the DTIA from SLM Rijk
  • Read the Q&A about applicable US law from Greenberg Traurig
