The European Commission's Special Panel on child safety online convened for its second session in Brussels on Wednesday, deepening the bloc's effort to establish a harmonised legal framework for protecting minors on social media platforms.
The day's proceedings focused on current EU rules and initiatives to protect minors online, drawing on input from experts advising member states, as well as international models – including Australia's minimum age requirement for social media access.
The session came one day after Commission President Ursula von der Leyen unveiled the EU's age verification app, a tool Brussels is positioning as its flagship technical response to concerns about children's digital safety. As of 15 April 2026, the app is technically ready for implementation and will be made available to citizens shortly.
The system uses "zero-knowledge proof" technology, meaning users can confirm they are above a certain age without disclosing any other personal data to platforms, and their activity cannot be tracked. The app is open source, allowing not only member states but also global partners and private companies to adopt it.
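The principle behind such a system can be illustrated with a deliberately simplified sketch. This is not the Commission's actual protocol: real deployments rely on asymmetric signatures and genuine zero-knowledge proofs, whereas the toy version below uses a shared HMAC key, and every name and parameter is illustrative. The idea it demonstrates is selective disclosure: a trusted issuer inspects the identity document once and signs only a boolean age predicate, so the platform can verify the claim without ever seeing a name or birth date.

```python
import hashlib
import hmac
import json
import secrets

# Hypothetical issuer signing key. In a real system the issuer would use an
# asymmetric key pair so platforms could verify without being able to forge.
ISSUER_KEY = secrets.token_bytes(32)

def issue_age_token(birth_year: int, current_year: int, threshold: int = 16) -> dict:
    """Trusted issuer checks the ID document, then emits only a signed boolean claim."""
    claim = {
        "over_threshold": current_year - birth_year >= threshold,
        "threshold": threshold,
        "nonce": secrets.token_hex(8),  # makes tokens unlinkable across platforms
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_verify(token: dict) -> bool:
    """Platform sees only the signed predicate -- no name, no birth date."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["claim"]["over_threshold"]

token = issue_age_token(birth_year=2005, current_year=2026)
print(platform_verify(token))  # True: age confirmed, identity never disclosed
```

Note that the token's claim contains nothing but the age predicate, the threshold, and a random nonce; the birth year never leaves the issuer. A production design would replace the shared key with digital signatures and add a zero-knowledge layer so that even the issuer cannot link a token back to a browsing session.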
The app is modelled in part on the digital certificate system deployed during the Covid-19 pandemic and is designed to be set up using a passport or national identity card. Seven member states — France, Denmark, Greece, Italy, Spain, Cyprus and Ireland — have been identified as frontrunners and are planning to integrate it into their national digital wallets.
The expert panel, co-chaired by Maria Melchior, Director of the French National Institute of Health and Medical Research (INSERM), and Professor Dr Jörg M. Fegert, Director of the Department for Child and Adolescent Psychiatry and Psychotherapy at Ulm University Medical Centre, is one of the central instruments through which von der Leyen intends to build an evidence base for future EU legislation.
The panel is tasked with exploring a harmonised EU-wide age limit for social media, protections tailored by age and risk level, and educational measures to promote safer use of digital platforms. The perspectives of young people, parents and educators — gathered through the President's Youth Advisory Board and the Safer Internet Forum 2025 — will inform the discussions.
By summer 2026, the co-chairs will report their findings and recommendations to the President, with a view to better protecting and empowering children online and to assessing whether harmonised age restrictions on access to social media and other online services are needed.
The panel does not operate with permanent membership; the specialists invited may vary from meeting to meeting, allowing the Commission to draw on the most relevant expertise for each topic under review.

The first meeting, held on 5 March 2026, assessed the current evidence on the risks and benefits of children's use of social media and other online activities. It also examined existing approaches to online safety within the EU and in third countries, with discussions centring on platform responsibility, age-appropriate safety-by-design features, algorithmic amplification, and addictive product design.
Speaking at the launch of the age verification app on Tuesday, von der Leyen left little ambiguity about her ambitions. "We must protect our children in the online world, just as we do in the offline world. And for that, we need a harmonised European approach," she said, adding that she expected all member states to begin customising the app for national use.
She also said she had "zero tolerance for companies that do not respect our children's rights", adding that children's rights in the EU come before commercial interests.
The Commission's push sits alongside, and aims to supersede, a patchwork of national measures. France's Senate has voted to ban minors from accessing social media, and Spain's Prime Minister Pedro Sánchez has announced a ban on social media access for children under 16.
Brussels is now seeking to replace this legislative fragmentation with a single European standard — an ambition that will require agreement among 27 member states with divergent legal traditions and political appetites for platform regulation.
The initiative draws its enforcement infrastructure from the Digital Services Act (DSA), which already places obligations on technology platforms to protect minors online.
The European Board for Digital Services convened in Brussels for its 18th meeting on 15 April 2026, where it reaffirmed its commitment to the protection of minors online and discussed ways to streamline its enforcement activities.
The panel's summer report is expected to prove politically consequential, potentially forming the basis for binding legislative proposals that would affect every major social media company operating in the European single market.