The UAE’s Child Digital Safety Law 2025: A New Era of Online Protection

Children and young people represent one of the most vulnerable segments of any society. While rising digital literacy is a positive sign of an economy’s progress, the accompanying risks, particularly for children, cannot be overlooked. Given their developmental stage, children are especially susceptible to online abuse, exploitation, and the broader harms associated with cybercrime. Ensuring a safe digital environment for children is therefore not merely a policy need but a legal and societal imperative. Recognising this urgency, the UAE enacted Federal Decree-Law No. 26 of 2025 on Child Digital Safety. This landmark legislation establishes a comprehensive legal framework aimed at safeguarding children in digital spaces by imposing stringent obligations on digital platforms and internet service providers operating within the UAE.
Scope of Application of the 2025 Law
The Law applies broadly to ensure a safe digital environment for children. Article 3 explicitly identifies three primary categories of persons and entities:
- Internet Service Providers (ISPs), i.e., any person licensed to provide users with access to the information network.
- Digital Platforms, covering natural or legal persons from either the public or private sector, whenever their operations involve a child’s use of the platform or exposure to its content or services. These platforms include websites, smart applications, messaging apps, electronic gaming platforms, social media platforms, live streaming platforms, e-commerce platforms, etc.
- Child Caregivers, defined as any person legally responsible for a child, such as a parent or guardian, or anyone entrusted by law with decisions about the child’s care and safety.
Notably, the Law applies both to digital platforms of entities operating within the State, i.e., the UAE, and to entities targeting users in the State, regardless of where the entity is physically or legally based.
Who Is a Child under the 2025 Law?
The Law defines a “Child” as any person below the age of 18 years. However, stricter prohibitions on the collection, processing, or sharing of personal data apply specifically to children under the age of 13.
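The two age thresholds can be expressed as a simple decision rule. The sketch below is purely illustrative; the function name and regime labels are assumptions for demonstration and do not come from the Law itself:

```python
def applicable_regime(age: int) -> str:
    """Map a user's age to the protection regime under the 2025 Law.

    Users under 13 trigger the stricter data-processing prohibitions;
    all users under 18 are "children" and receive the general protections.
    The regime labels here are illustrative, not statutory terms.
    """
    if age < 13:
        return "child-under-13"  # stricter data rules apply
    if age < 18:
        return "child"           # general child protections apply
    return "adult"               # outside the Law's child-protection scope


print(applicable_regime(12))  # child-under-13
print(applicable_regime(15))  # child
print(applicable_regime(18))  # adult
```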
The Establishment of the Child Digital Safety Council
Article 4 of the Decree-Law establishes a Child Digital Safety Council to serve as the central coordinating body for child digital safety, chaired by the Minister of Family. Its primary aim is to coordinate between the various ministries, federal and local entities, and relevant private sector organisations concerned with digital safety for children.
Further, Article 5 sets out the core competencies and responsibilities of the Council, including but not limited to:
- Proposing the strategic direction, national initiatives, and policies related to child digital safety to the Cabinet.
- Reviewing legislation to ensure that it keeps pace with technological developments, emerging risks, and international best practices and standards.
- Proposing frameworks for awareness programs that promote a culture of safe digital use.
- Evaluating the effectiveness of existing policies, strategies, and legislation, and submitting periodic reports on them.
- Proposing general standards for digital privacy and security, guidelines for platform use, and also supporting the development of various technical tools to protect children from harmful content.
Platform Classification System
To implement governance standards, the Cabinet shall also issue the Digital Platforms Classification System, which classifies digital platforms based on risk assessment. A platform’s classification determines the level of technical and legal obligations it must meet. The key components and functions of the system include:
- Platforms will be categorised based on their type, content, level of usage, impact, and other relevant regulatory standards.
- The system shall define controls for different age groups and the required age verification mechanisms.
- The system shall also outline the procedures and mechanisms used by authorities to verify that digital platforms are meeting the obligations assigned to their classification.
Privacy and Personal Data Protection
Taking into consideration the applicable personal data protection legislation (Federal Decree-Law No. 45 of 2021 on the Protection of Personal Data and the relevant free zone legislation), digital platforms operating within the State or targeting users in the State are prohibited from collecting, processing, publishing, or sharing the personal data of children under the age of thirteen (13) except after fulfilling and verifying the following:
- Consent: While collecting, processing, publishing, or sharing the personal data of children under the age of 13, platforms must obtain explicit, documented, and verifiable consent from the child’s caregiver. There must also exist an easy mechanism for the caregiver to withdraw this consent without negative consequences for the child.
- Disclosure: The platform shall disclose its data privacy policy and the specific purpose for collecting the data to both the child and their caregiver in a clear manner.
- Restricted access and use: Access to the child’s personal data must be limited only to authorised persons, and the data cannot be used for commercial purposes.
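A platform enforcing the three conditions above might gate the processing of under-13 data roughly as follows. This is a hypothetical sketch: the class names, field names, and the `may_process` function are illustrative assumptions, not terms or requirements taken verbatim from the Law:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class CaregiverConsent:
    documented: bool         # consent recorded in an explicit, verifiable form
    withdrawn: bool = False  # caregiver may withdraw consent at any time


@dataclass
class DataRequest:
    child_age: int
    purpose_disclosed: bool     # privacy policy and purpose shown to child and caregiver
    commercial_use: bool        # the data may not be used for commercial purposes
    requester_authorised: bool  # access limited to authorised persons only
    consent: Optional[CaregiverConsent] = None


def may_process(req: DataRequest) -> bool:
    """Return True only if processing a child's data satisfies all conditions."""
    if req.child_age >= 13:
        # The under-13 prohibitions do not apply (other safeguards still do).
        return True
    return (
        req.consent is not None
        and req.consent.documented
        and not req.consent.withdrawn
        and req.purpose_disclosed
        and req.requester_authorised
        and not req.commercial_use
    )
```

Note how withdrawal of consent immediately fails the check, mirroring the requirement that caregivers can withdraw consent without negative consequences for the child.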
Obligations for Digital Platforms
Primarily, digital platforms are required to comply with the obligations corresponding to their classification under the Classification System described above.
- Article 8 mandates that digital platforms shall adopt effective and reasonable age verification mechanisms.
- Further, Article 9 strictly prohibits such platforms from allowing children to access online commercial games, particularly those involving gambling, wagering, or betting.
- Article 10 lists the enhanced protection measures that platforms shall undertake. These include:
- Default privacy settings of these platforms shall offer the highest level of protection for child accounts.
- They shall offer technical controls such as content filtering, age classification, and tools to disable features that encourage excessive, potentially harmful interaction.
- To support parental control, they shall provide tools allowing caregivers to set daily usage limits and mandatory break times.
- These platforms shall also deploy proactive detection measures using AI and machine learning to detect, remove, or report harmful content immediately.
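The parental-control obligations above (daily usage limits and mandatory break times) could be enforced with a simple session check. The sketch below is an assumption about how a platform might implement this; the function and parameter names are illustrative and not drawn from the Law:

```python
def session_allowed(minutes_used_today: int, daily_limit: int,
                    minutes_since_break: int, break_interval: int) -> bool:
    """Check whether a child's session may continue under caregiver-set controls.

    daily_limit and break_interval are values a caregiver would configure
    through the platform's parental-control tools (hypothetical names).
    """
    if minutes_used_today >= daily_limit:
        return False  # daily usage limit reached
    if minutes_since_break >= break_interval:
        return False  # mandatory break time is due
    return True


print(session_allowed(30, 60, 10, 45))  # True: within limit, no break due
print(session_allowed(60, 60, 10, 45))  # False: daily limit reached
print(session_allowed(30, 60, 45, 45))  # False: break is due
```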
Obligations for Internet Service Providers
The specific obligations of Internet Service Providers (ISPs) are set by the Telecommunications and Digital Government Regulatory Authority (TDRA). According to Article 11, ISPs shall:
- Activate content filtering systems at the network level to block harmful content.
- Provide guidance tools that enable caregivers to monitor the digital content accessed by their children.
- Immediately report any child pornography or harmful content circulated on the network, and provide information and data regarding the individuals or entities involved in such content.
- Implement any other measures needed to ensure safe use for child users.
Responsibilities of Caregivers
Under Article 13, caregivers are required to regularly monitor children’s digital activities and use parental control tools, refrain from creating accounts for children on platforms that do not match their age group, and protect children from exploitation on digital platforms. They shall also educate their children about the health risks of excessive digital use.
Penalties
Platforms that fail to comply with the Law may face partial or full blocking, closure, or administrative fines as defined by the legal framework. For violations that may constitute criminal offences, such as child pornography, harmful content, or digital abuse or exploitation, entities may also be subject to public prosecution.
The 2025 Decree Law marks a significant step in ensuring the safety of children in the digital space. With provisions for safeguards, awareness initiatives, and a coordinated governance framework, it underscores the UAE’s commitment to fostering a secure and responsible digital environment aligned with evolving global standards.