The United Arab Emirates has rolled out a strict new law to protect children from online dangers and make sure they’re using safe, age-appropriate digital content. The move is part of a wider effort to prioritize child safety as the nation gears up for 2026, which will be the Year of the Family.
Announced on Friday, the federal decree is designed to shield children from harmful digital content and practices that could negatively impact their physical, emotional, or moral development. The law sets out clear responsibilities for authorities, digital platforms, and caregivers, creating a strong framework for child protection online.
The law targets internet service providers and digital platforms operating in the UAE or catering to users there. That includes everything from websites and search engines to messaging apps, forums, online games, social media, live streaming, podcasts, video-on-demand services, and e-commerce platforms.
Caregivers aren’t left out either—the law gives them specific obligations to keep children safe online.
A National Child Digital Safety Council, chaired by the Minister of Family, will be created to oversee these protections. The council will propose policies and strategies to keep children as safe as possible on digital platforms.
One of the standout features is a national system for classifying digital platforms, ranking them based on risks and influence. Platforms will be evaluated on their content, type, user numbers, and overall impact, with age limits and controls enforced wherever needed.
Digital platforms now face tough restrictions—they cannot collect, use, publish, or share personal data from children under 13 unless certain conditions are met. Platforms for educational or health purposes are exempt, though.
The law also spells out serious obligations for digital platforms and service providers. These include activating default privacy settings, putting age-verification and content-filtering tools in place, using age-rating systems, and following strict rules for targeted advertising.
Children are explicitly barred from accessing commercial online games that include gambling or betting. Internet service providers must install filtering systems to block harmful content, and guardians are required to sign off on parental controls as part of their service agreements.
Caregivers must actively monitor children’s digital activities, use parental control technology, and avoid creating accounts for children on platforms that aren’t age-appropriate or don’t meet child protection standards.
There’s also a clear process for reporting harmful content, with fast action required when it comes to abuse or exploitation.
On top of all this, four new initiatives have been launched to boost children’s digital safety.