Commissioned by the Ministry of Justice and Security, Privacy Company investigated the privacy risks of G Suite Enterprise, a suite of work and communication apps such as Gmail, Chat, Meet, Forms, Docs and Slides. Google has since renamed these services to Google Workspace.
We publish this blog about our findings with the permission of the Ministry of Justice and Security. The minister has informed the Dutch Lower House about the outcomes of this DPIA (letter in Dutch only). For questions about the DPIA, please contact SLM Microsoft Rijk (Strategic Supplier Management Microsoft Rijk), which can be reached via the press office of the Ministry of Justice, 070 370 73 45.
Privacy Company conducted technical and legal research into the data Google processes when Google Workspace is used on mobile phones with the iOS and Android operating systems, on a Chromebook running ChromeOS, on a MacBook and on laptops with Windows 10. You can use the services by installing separate apps, or you can use them online with a browser. Google's own Chrome browser was used in this Data Protection Impact Assessment (DPIA). The DPIA also analyses what happens when you use built-in microservices (Features) such as the spell checker, and when you use other well-known Google services such as Maps, the search engine and YouTube from within the core services.
After completion of the DPIA, SLM Rijk negotiated with Google to tighten the privacy terms in order to limit the high risks. The amended privacy terms apply to the Enterprise version of the Google Workspace services. The government wants to create the possibility of using these services in the future as an alternative to the Microsoft Office services.
As a result of the negotiations, Google has clarified its role as data processor for the so-called Customer Content Data. These are the content data that customers actively upload to the Core Online Services, such as the content of video calls, e-mails and documents. The role of processor also applies to the use of built-in microservices such as the spell checker (Features) and the use of the Google work account in the core Google Workspace services. In the Core Services, Google limits the processing of the Customer Content Data to three purposes.
The three purposes are:
1. to provide and improve the Services and Technical Support Services subscribed to by Customer;
2. to identify, address and fix security threats, risks, bugs and other anomalies; and
3. to develop, deliver and install updates to the Services subscribed to by Customer (including new functionality related to the Services subscribed to by Customer).
It has also been agreed that Google may not use the content data and the usage data from the Core Services for advertising, profiling, data analytics and market research purposes. The Dutch government has negotiated effective audit rights, so that it can audit Google's compliance with the improved privacy terms and applicable privacy laws and regulations.
The DPIA, which Privacy Company conducted between December 2019 and June 2020, showed that there were 10 high and 3 low privacy risks resulting from the use of G Suite Enterprise. Google has taken or announced a number of measures over the past six months to mitigate those risks. However, even if Google has properly implemented all announced measures, there are still 8 high privacy risks (in addition to the 3 unchanged low privacy risks).
Many of the identified privacy risks stem from Google's position that it may process the information it receives about employee behaviour for its own purposes. In fact, Google considers itself to be an independent controller for the personal data on the individual use of the online services, the Diagnostic Data. The same applies to the content of (and information on) support requests that employees submit to Google, and comments that users submit via the Feedback form. Google calls these three types of data (the Diagnostic Data, the Support Data and the information from the Feedback Form) Service Data. The DPIA report concludes that Google does not have the freedom to determine its own purposes, because it can only obtain the Service Data if the State first entrusts it with Content Data in a role as data processor.
At the end of November 2020, Google published a new privacy notice on the processing of Service Data, the Google Cloud Privacy Notice. In it, Google describes 17 of its own purposes for the processing of personal data. The fact that Google mentions these purposes is an improvement, but it does not solve the problem that the State loses control over the personal data of its employees if the State allows Google to process these data for its own commercial purposes. These purposes are broad and unclear, such as showing recommendations, combining the Service Data with information from (unknown) other Google products and services, and using algorithms to recognise patterns. Moreover, Google can change the purposes for the processing of the Service Data at will, by amending the privacy notice, as long as it does not process the data for the four 'prohibited' purposes of advertising, profiling, data analytics and market research.
The DPIA report concludes that, despite the agreed purpose limitation, Google does not in fact act as a processor for the Content Data either. The reason for this is that Google is not willing to limit the processing of these Content Data to what is strictly necessary, while the three purposes mainly relate to the processing of the Diagnostic Data to improve, maintain and secure the services. The Content Data seem to be needed only to provide a service such as spell-checking, and to answer content-related support requests.
These two problems with Google's role lead to yet another privacy risk, namely that the government organisations do not have a legal ground for the data processing. The DPIA report explains that the government organisations cannot rely on the consent of the civil servants, on the necessity to perform a contract with the employees, or on a public or legitimate interest if they want to transfer personal data to a third party with whom they do not have a valid privacy agreement.
Other privacy risks for government employees relate to the lack of information that Google is required to publish under European privacy legislation: the exact types of personal data that Google collects through telemetry, through the use of its website and in its cloud log servers; the specific purposes for which it processes these personal data; the retention periods; and the parties with which it shares these personal data. In reply, Google has promised to publish information on the content of the telemetry data by the end of 2021, so that the State can then verify this processing (or have it verified) in an audit. Google has also published a guide for system administrators, the Google Workspace data protection implementation guide, which explains the rules and the available privacy choices.
The eight high residual risks are:
1. Lack of purpose limitation Content data (no limitation of processing to what is strictly necessary): loss of confidentiality of personal data, loss of control, risk of re-identification
2. Lack of purpose limitation Diagnostic data (also for the separate data stream from ChromeOS and the Chrome browser): loss of control, unlawful processing
3. Lack of transparency Content Data (not visible what Content Data Google collects via telemetry, Chrome Enhanced spellcheck not easily identifiable and cannot be turned off on all devices without additional cost): loss of control
4. Lack of transparency Diagnostic Data (promise of information on telemetry content by the end of 2021, but no information on retention periods and processors/third parties): loss of control and risk of re-identification
5. No legal ground for Google and the government organisations (Google not a processor and does not accept joint controllership, no commitment to comply with cookie rules): loss of control, unlawful processing
6. Lack of privacy controls for administrators and users (no ‘Off’ button for telemetry and for the Feedback form): loss of control and loss of confidentiality
7. Lack of control on transfer of Diagnostic data to processors and third parties: loss of control, loss of confidentiality
8. Impossibility for government employees to exercise their data subject access rights
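Some of the listed transparency problems touch concrete product settings. As a purely illustrative, hedged sketch (not a measure proposed in the DPIA itself): the cloud-based "enhanced" spell check mentioned in risk 3 can be disabled on centrally managed Chrome installations via the Chrome enterprise policy SpellCheckServiceEnabled, which controls whether typed text is sent to Google's spelling web service. The policy file below assumes a Linux machine, where Chrome reads managed policies from /etc/opt/chrome/policies/managed/:

```json
{
  "SpellCheckServiceEnabled": false
}
```

On Windows the same policy is set via the registry or Group Policy, and on ChromeOS via the Google Admin console. Note that, per risks 3 and 6, this only addresses the spell-check data stream; the report finds no comparable 'Off' switch for telemetry itself.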
The three remaining low risks are:
1. Use of a Cloud provider (possible unlawful access to content and metadata): loss of control, loss of confidentiality, re-identification of pseudonymised data and unlawful (further) processing
2. Employee monitoring system (use of available log data by government organisations to assess employee performance): chilling effect to exercise (related) rights
3. Impossibility to delete historic diagnostic data: increased risk of re-identification of pseudonymised data and unlawful (further) processing
On 15 February 2021, SLM Microsoft Rijk applied to the Dutch Data Protection Authority for prior consultation. Article 36 of the GDPR requires such prior consultation when an organisation concludes in a DPIA that there are high residual risks that it cannot resolve by itself.
In response to the revised DPIA (with the new table of residual risks), Google has indicated that it is "willing to assess the feasibility of changing its role for Service Data in the future".
The how and why of the privacy risks are explained in the new English DPIA report for SLM Microsoft Rijk.
The Minister of Education has informed the Dutch Lower House (letter in Dutch only) about the results of the DPIA that Privacy Company simultaneously conducted on G Suite (Enterprise) for Education (the free and the paid versions of the Google services for schools and universities). This report concludes that the same 8 high data protection risks also apply to the use of the Google services for education.