EU Prepares Summer Law to Bar Children from Social Media as Von der Leyen Vows to 'Give Childhood Back'

Ursula von der Leyen has called for a Europe-wide minimum age for social media, warning that tech companies are deliberately profiting from children's insecurities and that "no tech company in the world should be allowed to treat them as a commodity".

By Creatives Unite Newsroom
May 12, 2026

The European Commission president opened a Copenhagen summit on child safety in the digital age on Tuesday with a stark warning that artificial intelligence is accelerating risks to children online and signalled that a formal legislative proposal on minimum social media age could be ready as early as this summer.

Speaking at the conference – held in a country she praised for being among the first, a decade ago, to bring screens and digital technology into schools and which is now taking the lead in confronting the consequences – von der Leyen said the harms children face online were "not accidental" but the deliberate product of commercial choices by technology platforms. "They are the result of business models that treat our children's attention as a commodity," she told the assembled ministers and officials. "The more attention, the higher the profit."

Von der Leyen cited a finding by a Danish children's rights organisation that nearly half of all content children encounter on social media is advertising. She described young men being drawn into games engineered to extract increasing sums of money, and young women targeted with beauty product advertisements the moment they untag themselves from a photograph. "There is a reason why some call it 'the greatest brain hack in human history'," she said, quoting a phrase applied by critics to the social media attention economy — a description she did not dispute.

The documented consequences, she said, include sleep deprivation, depression, anxiety, self-harm, addictive behaviour, cyberbullying, grooming, exploitation and suicide. With the rapid advance of artificial intelligence, those risks were "multiplying fast". "Children are not commodities," she said, "and no tech company in the world should be allowed to treat them as such."

Regulatory Action Already Under Way

The Commission president detailed enforcement action already taken under the Digital Services Act, Europe's primary legislative instrument for regulating online platforms:

TikTok is facing action over addictive design features, including endless scrolling, autoplay and push notifications. Meta is under investigation over allegations that Instagram and Facebook are failing to enforce their own stated minimum age of 13. Proceedings have been launched against X over the use of its Grok AI system in generating and spreading material depicting child sexual abuse.

Platforms that expose children to "rabbit holes" of harmful content — including videos promoting eating disorders or self-harm — are under active investigation.

Von der Leyen also cited the Digital Markets Act as a companion instrument to prevent platforms from abusing market power, noting that enforcement cases had already been closed against Apple and Meta, with an investigation into Google ongoing. "We have set rules; it is the law, and those who break it will be held accountable," she said.

The Case for a Minimum Age

The central proposal von der Leyen endorsed was what she termed a "social media delay" — a statutory minimum age below which children cannot hold accounts. She said a special panel of experts on child safety online, established by the Commission, had been asked to advise on the question and that its findings could trigger a legislative proposal by summer 2026.

The political momentum behind such a measure is considerable. Almost all EU member states have called for a formal assessment of the need for a minimum age. Denmark has already moved to introduce one nationally; nine other member states, which the Commission did not name at the conference, are pursuing the same course. The European Parliament has reached the same conclusion.

Von der Leyen drew a pointed distinction in framing the question: "The question is not whether young people should have access to social media; the question is whether social media should have access to young people."

She held up Australia, which introduced a minimum social media age of 16, as a pioneer, though she acknowledged that progress there had been uneven. While some platforms had moved to inform users and deactivate underage accounts, others were "actively encouraging teenagers to bypass these safeguards". Sustainable change, she conceded, does not happen overnight – but delay carries its own cost: "If we are slow and hesitant, it will be another entire generation of children that pay the price."

An Age-Verification Tool Ready to Deploy

The Commission announced that a European age-verification application is ready for rollout, built on the architecture of the EU's Digital COVID Certificate, which was deployed in 78 countries across four continents. The app meets what von der Leyen described as the world's highest privacy standards, works on any device, is easy to use and is fully open-source, enabling platforms to adopt it without technical or commercial barriers. It is scheduled for deployment in Denmark by summer 2026. The Union is working with member states to integrate it into national digital wallets.

"No more excuses," she said. "The technology for age verification is available."

Von der Leyen was unambiguous that an age limit would not relieve platforms of responsibility for the content they host. Drawing on an automotive analogy, she argued that just as car manufacturers bear legal responsibility for fitting seatbelts and airbags — and parents are not expected to do so at home — so technology providers must bear responsibility for the safety of their products from the moment of design, not as an afterthought.

"Safety by design" was the governing principle she invoked. She said it would be strengthened under a forthcoming Digital Fairness Act, expected later in 2026, that targets attention-capture techniques, complex contracts, and subscription traps. Children's rights have also been prioritised in the Commission's AI governance rules.

The burden on parents, she argued, is already too heavy. "The responsibilities on parents are already so high, so let us take this additional weight from their shoulders."

Media Literacy as a Parallel Obligation

While regulation dominated the speech, von der Leyen stressed that children must also be equipped with the critical skills to navigate digital environments autonomously. She called on parents, teachers, media organisations, NGOs and journalists to build those competencies from an early age – including the ability to verify sources, identify AI-generated imagery and resist manipulation. "The principle is to encourage thinking critically before clicking," she said. "Media education is a key task for society as a whole."

Von der Leyen closed with a direct rebuttal to those who regard major technology corporations as beyond democratic control. "We do not have to accept addictive social media designs," she said. "We do not have to accept children being drawn into ever more extreme content. We do not have to accept that girls and women have their photos used for AI-generated sexualised images... It is us who decide our rules, not the tech companies."

"Let us give childhood back to children", she concluded — time to play with real friends rather than chase followers; time on a football pitch or playing in a band; time to develop their own ideas rather than be guided by an algorithm; and time to learn the difference between reality and falsehood.

The full speech transcript is published on the European Commission website.


Image: CC-BY-4.0: © European Union 2023 – Source: EP
LLMs were used to source and fact-check this story. CU wrote, edited and curated it.