First Meeting of The Special Panel on Child Safety Online

The European Commission launched the first Panel on Child Safety Online to explore potential age restrictions for social media and develop comprehensive digital protections for minors across the European Union.


By Creatives Unite Newsroom
March 10, 2026

"For decades, we have made the real world safer for children, and we must do the same in the digital world. The positive opportunities that technology offers cannot come at the cost of their safety, health or happiness." European Commission President Ursula von der Leyen framed the issue in strong terms after the first inaugural meeting of the Special Panel on Child Safety, held online. The group was convened to advise on stronger protections for minors and explore potential EU-wide age restrictions for social media, with policy recommendations expected by mid-year.

In her annual State of the Union speech—delivered on September 10, 2025—von der Leyen said she was considering how children in the EU could be restricted from using social media, drawing attention to Australia's approach, where teenagers under the age of 16 cannot use platforms like Snapchat, TikTok, Facebook, Instagram, and X. She stated, "I will commission a panel of experts to advise me by the end of this year on the best approach for Europe."

The panel brings together specialists in health, neuroscience, psychology, computer science, children's rights, and digital literacy from across the EU, alongside youth representatives.

Two prominent academics co-chair the panel. Maria Melchior is a research director at France's National Institute of Health and Medical Research (INSERM), while Professor Jörg M. Fegert is director of the Department of Child and Adolescent Psychiatry and Psychotherapy at Ulm University Medical Centre. Fegert is also a board member of the European Society for Child and Adolescent Psychiatry, chairman of the Academic Advisory Board on Family Affairs of the German Federal Ministry for Family Affairs, and a member of the National Council Against Sexual Violence against Children and Adolescents.

Additionally, COFACE Families Europe president Antonia Torrens has been invited to join the panel to advise on child safety online and potential age limits for social media platforms.

Discussions during the first meeting centred on platform responsibility, including age-appropriate safety-by-design features, algorithmic amplification, and addictive product design. The agenda also addressed digital literacy for children, parents, and educators, while considering how regulatory measures can reduce risks without undermining the benefits of online participation.

The European Commission also signalled an intent to advance child safety online in collaboration with Australia's eSafety Commissioner and the UK's Ofcom across several areas.

No specific policy recommendations were formally issued at the first meeting. The session focused on evidence review and initial deliberations, with the panel's full report and recommendations reserved for summer 2026.

The Special Panel is tasked with exploring key issues for the online safety of children, such as a harmonised EU-wide age limit for social media, tailored protections based on age and risk, and educational measures to promote responsible social media use.

The panel will also draw on feedback from the Citizens' Panels and the President's Youth Advisory Board, which submitted their views to the President in January 2026.

The second meeting is expected to take place in the coming months and will explore possible policy options. The panel aims to present a report with recommendations to the Commission President by summer 2026.


In 2025, the European Commission released a prototype of an EU-wide age verification app, which allows users to prove they are over 18 when accessing restricted adult content. It is currently being piloted by Denmark, France, Greece, Italy, and Spain. The Commission's accompanying guidelines on the protection of minors distinguish between three age assurance approaches: self-declaration (considered unreliable), age estimation using tools such as facial analysis, and age verification using official IDs or trusted digital credentials such as the upcoming EU Digital Identity Wallet, expected in 2026.

The Commission has also adopted an action plan against cyberbullying, noting that 1 in 6 adolescents have experienced cyberbullying and 1 in 8 admit to having participated in it; cyberbullying has been the main topic of calls to Safer Internet Centre helplines over the last five years.

During 2025, the Commission opened formal proceedings against pornography providers related to age verification failures and also sent information requests to Snapchat, YouTube, the Apple App Store, and Google Play to understand the measures these companies have in place to protect minors.

A key political subtext of the panel's work is the fragmentation of child online safety rules across Europe. To date, there is no outright ban at the EU level on children accessing social media, while some member states — including Denmark, Greece, France, Spain, Italy, Ireland, and Poland — have been discussing the option of a social media ban for children.

France, Germany, Italy, and Ireland have all put in place legally binding rules mandating age assurance to ensure minors do not access certain content, though these national measures all have limitations, hence recent calls for action at the EU level. Under EU rules, individual member states cannot unilaterally impose additional obligations on large platforms — such as age controls — as this is the sole competence of the European Commission.

The next major legislative step expected is the Digital Fairness Act, through which MEPs are pushing to transform today's non-binding guidance into binding rules, particularly on dark patterns, algorithmic rabbit holes, and the role of influencers in targeting minors. Commission officials have signalled that "the protection of minors will be a transversal and key element" of that act.

The Special Panel's summer 2026 recommendations could prove decisive in determining whether the EU moves toward a harmonised minimum age for social media access — and whether that age mirrors Australia's under-16 threshold or takes a different form tailored to the European rights framework.


Image Credit: Digital Mom Blog, CC BY 4.0
Image 2: Courtesy of the European Commission
*An AI LLM has been used for the production of this article, which has been curated, fact-checked and edited by the Creatives Unite editorial team.