
UAE Child Digital Safety Law Targets TikTok, Twitch, Snapchat, And More


Dubai: The UAE has introduced one of the region’s most comprehensive online child protection frameworks. As a result, digital platforms now face stronger obligations to protect young users.

The new Child Digital Safety Law (CDS Law) tightens rules around harmful content, excessive engagement, and the collection of children’s personal data. It applies to both local companies and foreign platforms that target users in the UAE, even to those without a physical presence in the country.

Global Apps Face Wider Responsibilities

The CDS Law covers a broad range of digital services accessible to children in the UAE: major social media apps, messaging platforms, streaming services, online games, and e-commerce websites all fall under the new requirements.

Moreover, the law goes beyond content moderation and directly focuses on child data protection. For users under the age of 13, platforms cannot collect or use personal data unless they have explicit, documented, and verifiable parental consent.
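To illustrate what such a consent gate could look like in practice, the Python sketch below shows a hypothetical check a platform might run before processing a young user’s data. The names (ConsentRecord, can_process_child_data, CONSENT_AGE) are illustrative assumptions, not terms taken from the law or any platform’s API.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Age below which the CDS Law requires verifiable parental consent.
CONSENT_AGE = 13

@dataclass
class ConsentRecord:
    """Hypothetical record of explicit, documented, verifiable parental consent."""
    guardian_id: str
    child_id: str
    granted_at: datetime
    verification_method: str  # e.g. "verified_id" or "payment_card_check"
    revoked: bool = False

def can_process_child_data(age: int, consent: Optional[ConsentRecord]) -> bool:
    """Allow personal-data processing for under-13 users only with valid consent."""
    if age >= CONSENT_AGE:
        return True  # other safeguards still apply; this gate covers under-13 data only
    if consent is None or consent.revoked:
        return False
    # Consent must be verifiable, not merely asserted by the user.
    return bool(consent.verification_method)
```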

Compliance Deadline, Required Changes, And Penalties

Federal Decree-Law No. 26 of 2025 on Child Digital Safety came into force on January 1, 2026. Foreign online platforms have one year to update systems and policies before enforcement measures apply.

The CDS Law also raises compliance expectations by requiring child-focused safeguards instead of generic safety settings. For example, platforms must introduce age-verification mechanisms that match the level of risk on their services.

In addition, child accounts must use default high-privacy settings and include tools that enforce minimum age requirements. Platforms must also apply content filtering, age classification systems, and restrictions on targeted advertising.
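As a rough illustration rather than a prescribed implementation, the following Python sketch shows how a platform might encode high-privacy defaults for child accounts and match age-verification strength to a service’s risk level. All field names, tier labels, and numeric values are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class ChildAccountSettings:
    """Illustrative high-privacy defaults for a child account."""
    profile_visibility: str = "private"        # default high-privacy setting
    targeted_ads: bool = False                 # targeted advertising restricted
    messages_from_strangers: bool = False
    content_rating_ceiling: str = "all_ages"   # tied to an age classification system
    daily_screen_time_limit_minutes: int = 60  # example value, not mandated by the law

def required_age_check(service_risk: str) -> str:
    """Map a service's risk level to an age-verification strength (illustrative tiers)."""
    return {
        "low": "self-declared age",
        "medium": "age estimation plus parental confirmation",
        "high": "verified ID or equivalent hard check",
    }[service_risk]

def settings_for_new_child_account(age: int) -> ChildAccountSettings:
    """Every account below the adult threshold starts from the locked-down defaults."""
    assert age < 18, "sketch covers child accounts only"
    return ChildAccountSettings()
```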

Furthermore, services must add features that reduce excessive screen time and over-engagement. They must also provide clear reporting channels for harmful content, actively detect violations, remove harmful material, and submit periodic reports to relevant authorities.
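A minimal sketch, again with purely illustrative names, of how a reporting record and a screen-time nudge could be represented:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class HarmfulContentReport:
    """Illustrative record behind a user-facing reporting channel."""
    reporter_id: str
    content_id: str
    reason: str
    submitted_at: datetime
    resolved: bool = False

def should_prompt_break(session_start: datetime, now: datetime,
                        limit: timedelta = timedelta(minutes=45)) -> bool:
    """Nudge a young user to pause once a session exceeds the configured limit."""
    return (now - session_start) >= limit
```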

Failure to comply carries serious consequences: non-compliant platforms may face partial or full blocking, closure, and financial penalties.

Children remain highly exposed online because they often share personal information gradually, and repeated sharing across platforms can increase the risk of exploitation, impersonation, and account compromise over time.
