In just two years, online platforms have reversed almost 50 million decisions affecting users’ content or accounts, helping users exercise their Digital Services Act (DSA) rights online in the EU.
Text "Digital Services Act" inside a white triangle inside a white triangle against a purple background.
With the DSA, users in the EU are more empowered online, online platforms face greater accountability, and the online environment is more transparent.
This instrument, the first of its kind in the world, gave users the right to challenge platforms’ content moderation decisions that remove, suspend, delete or ‘shadow ban’ their content or accounts. Since the rules became applicable, users have appealed 165 million content moderation decisions through platforms’ internal complaint mechanisms, and 30% of those decisions have been reversed.
Notably, in the first half of 2025, 99% of content moderation decisions were taken by platforms to enforce their own terms and conditions, rather than to remove content reported as illegal under EU or national law.
In the first half of 2025, out-of-court settlement bodies reviewed over 1,800 disputes related to content on Facebook, Instagram and TikTok in the EU, overturning the platforms’ decisions in 52% of the closed cases – restoring content and accounts in a faster and cheaper way than going to court.
The DSA has also driven concrete changes in user safety and wellbeing. Thanks to this legislation, targeted advertising to minors on online platforms has been prohibited in the EU since 2024. The DSA also obliges online marketplaces to counter the spread of illegal goods, improve the traceability of traders, and quickly inform customers who purchased any illegal product on their marketplace, offering options for redress.
An additional merit of this legislation is that researchers and civil society now have unprecedented access to information on platforms’ processes and content moderation practices in the EU, enabling them to hold platforms accountable for their decisions.
Find out more here.