The rapid advancement of artificial intelligence (AI) has made it increasingly easy, and widely accessible, to create highly realistic “deepfakes”. These are manipulated images, videos, or audio recordings that convincingly replicate a particular person’s appearance or voice.
Positive Use of Deepfakes
This technology has significant and legitimate creative and commercial applications, subject (where applicable) to the consent of the person concerned or, if they are deceased, their estate. Some examples include:
- Accessibility and Assistive Technology
- Media production
- Entertainment
- Education and Training
- Privacy
- Marketing and Advertising (subject to the consent of the subject person)
Misuse of Deepfakes
While this technology has legitimate benefits (as detailed above), it also presents significant risks to individuals and organisations. Deepfakes can be used to commit fraud, spread misinformation, damage reputations, enable harassment, or impersonate individuals without their consent.
For example:
- Non-consensual pornography
- Fraud and financial scams
- Political manipulation and misinformation
- Cyberbullying and harassment
- Identity theft
- Reputational damage
- Extortion and blackmail
The online misuse of personal images and likenesses is a serious risk to privacy, identity protection, and digital security, particularly as manipulated content can be difficult to detect and may circulate widely or ‘go viral’ before it can be challenged or removed.
Copyright in your image
The concept of image rights exists in many jurisdictions around the world and, broadly, refers to a group of rights relating to an individual’s personality, including their image, likeness, or other characteristics associated with them. The primary function of image rights is to prevent unauthorised persons from exploiting or making use of an individual’s image.
Image rights have long formed the basis of commercial revenue streams for celebrities and famous people, whose name, signature, likeness, silhouette or other identifying feature is licensed on everything from coffee to jawline-strengthening face massagers:

De’Longhi’s “Perfetto” global campaign - 2022

Facial Fitness Pao advertising campaign - 2014
While copyright in a photo of a person can be established relatively easily, the concept of copyright in a person’s image has been more problematic.
Alright, Alright, Alright
Until relatively recently, misuse of another person’s image was generally confined to celebrities and famous people, and whether they had secured trade mark protection for their name, signature or other identifying indicia was a key factor in their means of enforcement.
For example, the actor Matthew McConaughey (through his company J.K. Livin Brands Inc.) has registered various trade marks in the United States across a range of goods and services for various indicia associated with him. This includes his catchphrase ‘Alright, Alright, Alright’ and also motion marks depicting him in various poses.
A recent example is a 7-second long motion mark, registered as US registration no. 7931810 in class 9.
These and other registrations are steps taken by Matthew McConaughey, and by others applying a similar strategy, to prevent the misuse by AI of their likeness and image. However, not everyone has the resources to engage in a multi-mark/multi-class trade mark registration strategy.
The rapid emergence of AI, in both technical ability and ease of access, means that practically anyone can generate a ‘deepfake’ (admittedly of varying quality). Crucially, however, anyone with an online presence (which is true for most people in today’s digital world) can also become a victim, even if they are not a celebrity.
What is the common person on the street to do?
A Danish approach
In June 2025, Denmark’s Minister for Culture, Jakob Engel-Schmidt, announced proposed amendments to the Danish Copyright Act addressing the use of AI-generated content which replicates individuals’ faces, bodies, and voices.
These changes are intended to safeguard people’s reputations and digital identities from misuse through generative AI. Two new provisions, Section 65a and Section 73a, have been introduced, affecting both performing artists and the general public.
Under the proposed law, individuals will hold copyright-like protection over representations of their own likeness, including for 50 years after their death.
However, the provisions do not apply to caricature or satire, unless such content amounts to misinformation intended to cause serious harm.
Although the proposed legislation primarily targets the use of photos, facial images, and voices, it applies broadly to any recognisable depiction of a person, including distinctive physical features such as silhouettes.
The proposed amendments grant Danish citizens three principal rights:
- Right of removal
- Right to compensation
- Platform liability
A UAE approach
Whether, and to what extent, other countries follow the approach of Denmark will become apparent in the near future.
From a UAE perspective, long-standing concerns regarding privacy, the unauthorised use of imagery, and damage to reputation have been addressed within various national laws for many years.
In a 2024 article, we explored the provisions of UAE law in relation to unauthorised photography in the UAE (see https://hadefpartners.com/news-insights/insights/no-photos-no-videos-is-photographing-or-videoing-of-people-in-a-public-place-illegal/). As an extension of this, and with regards to deepfakes, the provisions of the Federal Law by Decree No. (31) of 2021 Promulgating the Crimes and Penalties Law (Penal Code) are particularly helpful.
Article 425 of the Penal Code provides that:
Shall be punishable by confinement for a period not exceeding (2) two years or by a fine not exceeding (20,000) twenty thousand Dirhams any individual who, through any means of publicity charges another person with an incident susceptible of making him subject to punishment or exposing him to public hatred or contempt.
The punishment shall be confinement and a fine or one of these two penalties if the act of libel is committed against a public officer or any person to whom a public service is assigned, during the performance of his function or public service, due to or on the occasion of such performance; or if the act of libel violates the honour or takes from the reputation of families, or if it is noticeable that the act is intended for the attainment of an unlawful objective.
In the event where the act of libel is expressed by publication in newspaper or printed matters, this shall be considered a circumstance of aggravation.
In combination with this, Article 43 of Federal Decree-Law No. 34 of 2021 Concerning the Fight Against Rumours and Cybercrime (the Cybercrimes Law) provides that:
Whoever uses an information network, ITE, or an information system and insults another or attributes a quality to him that would make that person subject to punishment or contempt by third parties shall be punished with imprisonment and/ or a fine of not less than (AED 250,000) two hundred fifty thousand dirhams or more than (AED 500,000) five hundred thousand dirhams.
These provisions, which fall within defamation principles, are broad in nature and intended to give flexibility in protecting the reputation and honour of both an individual and their family.
As a result, where a deepfake has the effect of exposing the victim to “punishment or exposing him to public hatred or contempt”, that would arguably fall within Article 425 of the Penal Code and, if done via an electronic network (such as via the Internet, WhatsApp or another social media conduit), also within the provisions of the Cybercrimes Law.
A particularly helpful aspect of these provisions is the criminal element and the access to justice it affords.
While it is possible for a victim to pursue a civil claim for damages (as in other countries), the ability to file a criminal complaint means that a lack of financial resources is not a barrier to protecting their rights.
In this regard, and from an access to justice perspective, the legislative framework in the UAE currently provides an effective and accessible means of both deterring and combatting deepfakes.
It will be interesting to see to what extent other countries begin to deploy a similar approach, either alone or in combination with changes to national copyright laws such as those currently being considered in Denmark.
Thanks to Katie Meldrum and Naduli de Silva (interns) for their research for this article.
