Deepfakes and the GDPR

July 30, 2025
Following the Danish example: ja eller nej (yes or no)? A Danish bill proposal made headlines recently. It would give Danes copyright over their voice, body and face in order to better combat deepfakes. The proposal has been received positively for the most part, but there is also criticism regarding its necessity: doesn’t privacy legislation already offer protection against deepfakes?

Deepfakes: the what and the why

According to the Dutch Data Protection Authority (AP), a deepfake is a video or audio clip that has been edited by an algorithm but is often almost indistinguishable from the real thing. Deepfakes make it appear that something has happened that never actually took place.1 The vast majority of deepfakes are artificial pornography.2 In addition, deepfakes are sometimes used to spread political disinformation by making politicians appear to say things they never said, or simply to make satirical videos starring famous people.

The AI Act defines deepfakes as ‘AI-generated or manipulated image, audio or video content that resembles existing persons, objects, places, entities or events and would falsely appear to a person to be authentic or truthful’.3

A brief overview of the proposed legislation

The Danish proposed legislation extends copyright to people's voices, bodies and faces. Deepfakes could infringe this copyright. Deepfakes intended as parody, satire or social criticism, among other things, are exempt.4 This is not an absolute exception; a balance must be struck between the interests of the person depicted and the creator of the deepfake. The proposed legislation imposes an obligation on social media companies to remove offensive deepfakes. If the platforms fail to remove a deepfake, they may be fined.5

One of the reasons for the proposed legislation is that the current regulation of deepfakes is fragmented and unclear. This raises the question to what extent European privacy legislation already regulates deepfakes.

It’s fake, right? So why privacy?

When you create a deepfake in which someone's face, body, or voice is recognisable, you are processing personal data. The fact that this data may have been synthetically generated or incorrect does not alter the fact that it can be traced back to a person. Organisations that want to create deepfakes, for example to make an ad, must therefore take into account the rules of the General Data Protection Regulation (GDPR).

Points of attention under the GDPR

Among other things, this means that a legal basis for the processing is necessary. The most obvious options are the data subject’s consent, i.e. the person depicted6, or the legitimate interests of the creator or a third party.7

A deepfake may fall under the prohibition in Article 9 of the GDPR on the processing of special categories of personal data. This is the case when the deepfake uses biometric data, such as a person's face or voice, for the purpose of identifying that person. Photos and videos are not automatically covered by this prohibition. Photos fall within the definition of biometric data ‘only when processed through a specific technical means allowing the unique identification or authentication of a natural person.’8 It seems logical to extend this reasoning to deepfakes, as they are essentially fake photos and videos. Deepfakes, therefore, do not automatically fall within the scope of Article 9 GDPR. This is different, however, when a deepfake is processed in an algorithm that performs facial recognition, for example.9

If the manipulated content suggests something about a person's race, health, sexual preference or political beliefs, the deepfake may also constitute special category personal data. In principle, special category personal data is subject to a processing prohibition. In order to create and share the deepfake, the maker of the deepfake then needs the explicit consent of the data subject.

In addition, you must comply with all other requirements of the GDPR, such as transparency, data minimisation and security. However, it is questionable whether the accuracy (data quality) principle or the requirements of purpose specification and purpose limitation can be met at all when creating a deepfake.10

Data subject rights

The rights of data subjects are also important. Most relevant to data subjects in this context is the right to be forgotten: the controller must delete personal data if the data subject requests this. In this case, the controller is the person or organisation that creates and distributes the deepfake, but often also the platform on which this takes place.

The controller must also respect the right to information. The creator of the deepfake must therefore inform the person depicted that they are creating a deepfake of them.

The other rights of data subjects also apply as usual, so there is plenty to bear in mind. The use of deepfakes is legally risky and requires careful consideration.

Household exemption

The GDPR rules do not apply to processing within the context of household activities.11 This means that sharing and creating a deepfake within your personal circle does not fall within the scope of the GDPR. Please note: just because a social media account is personal and not commercial does not mean that the household exemption applies. Sharing content to public accounts is unlikely to be considered processing for purely personal activities. Consequently, you must comply with the GDPR. The household exemption naturally does not apply to business or commercial processing.

Regulation of deepfakes in other laws

The Danish lawmakers were not wrong – legislation on deepfakes is highly fragmented. This article focuses specifically on the obligations and rights arising from the GDPR, but it is important to note that other legislation may also apply.

For example, the AI Act imposes certain transparency requirements, individuals can sometimes invoke their portrait rights, and certain forms of deepfakes are criminal offences. In the Netherlands, the creation and sharing of pornographic deepfakes is prohibited and therefore punishable.12 The same applies, for example, to telephone scams using deepfake voices.13 Under the Digital Services Act (DSA), providers of digital services, including online platforms and social media, must enable users to report illegal content.14 Providers of very large online platforms must also carry out a risk assessment that covers the risk of illegal content being disseminated through their services, and take measures to mitigate the risks they identify.15

Revising copyright law in the Netherlands?

It is unclear whether the proposed legislation also requires platforms to proactively remove deepfakes. In practice, the obligation for platforms to remove deepfakes seems no different from the obligation to comply with deletion requests that already exists under the GDPR. In addition, the scope of the proposed legislation is considerably smaller than that of the GDPR: it only applies to Danish territory, and satirical content is more or less excluded.

The GDPR applies to all personal data (including inaccurate data) and offers protection to everyone in the EU. The GDPR does not contain an exception for satirical content or social criticism, but the right to be forgotten is not absolute. A balance may have to be struck between the interests of the person in the deepfake (privacy) and the interests of the creator of the deepfake (e.g. freedom of expression).

One limitation of the GDPR is that it only applies to living persons. In the Danish proposed legislation, copyright on one's own voice, body and face would apply until 50 years after death. Additionally, it is not always clear whether the household exemption applies.

A law is only as strong as its enforceability: the biggest criticism of the GDPR in the context of deepfakes is the lack of enforcement. However, it is also not certain that the Danish proposed legislation will be enforced effectively. It therefore remains to be seen what the effects will be for individuals if the law is implemented.

Conclusion

From a legal perspective, there seems to be little need to follow the Danish example and extend copyright protection at this time. In fact, it would further fragment an already fractured regulatory landscape. The GDPR imposes strict rules on data controllers and also grants various rights to data subjects to protect their privacy. In addition, the creation of pornographic deepfakes is a criminal offence.

For organisations, this means that if you want to create a deepfake, you should think carefully about the grounds, the principles of the GDPR, the risks and the rights of data subjects. For individuals, this means that if you recognise yourself in a deepfake, you have the right to transparency, objection and, in many cases, deletion.

Need advice on (complex) data protection issues? Discover our services. Do you require an introduction to the GDPR? We provide training courses.

1. Dutch Data Protection Authority, Deepfakes, URL: https://www.autoriteitpersoonsgegevens.nl/themas/internet-slimme-apparaten/beeldmateriaal/deepfakes, accessed on July 18, 2025.

2. Bart van der Sloot, Yvette Wagensveld, Bert-Jaap Koops, (TILT), ‘Deepfakes: de juridische uitdagingen van een synthetische samenleving’, (2021)

3. AI Act, Art. 3(60).

4. Luca Schirru, InfoJustice, Danish Bill Proposes Using Copyright Law to Combat Deepfakes, URL: https://infojustice.org/archives/46588#95246d81-3982-4261-b3c1-ccb36611baeb, accessed July 18, 2025.

5. The New York Times, Denmark Aims to Use Copyright Law to Protect People From Deepfakes, URL: https://www.nytimes.com/2025/07/10/world/europe/denmark-deepfake-copyright-ai-law.html, accessed July 18, 2025.

6. General Data Protection Regulation, Art. 6(1)(a).

7. General Data Protection Regulation, Art. 6(1)(f).

8. General Data Protection Regulation, Recital 51.

9. See, for example: Autoriteit Persoonsgegevens, Decision to impose fines and orders subject to a penalty for non-compliance, 16 May 2024 (Clearview fine), paras 35-36.

10. Bart van der Sloot, Yvette Wagensveld, Bert-Jaap Koops, (TILT), ‘Deepfakes: de juridische uitdagingen van een synthetische samenleving’, (2021), p. 4-5.

11. E.g.: CJEU C-101/01 (Lindqvist).

12. Dutch criminal code, Art. 139h.

13. See, for example: Biometric Update, ‘Hi mom, it’s me’: voice cloning services demand stronger voice deepfake detection, URL: https://www.biometricupdate.com/202503/hi-mom-its-me-voice-cloning-services-demand-stronger-voice-deepfake-detection, accessed July 21, 2025.

14. Digital Services Act, Art. 16(1).

15. Digital Services Act, Art. 34-35.

Suzette
Consultant