The first harmonised submissions of content moderation reports from very large online platforms were due at the end of February 2026, covering the period from July 2025 onwards. These reports give civil society and public authorities the transparency needed to hold platforms to account.
By Creatives Unite Newsroom

Last week marked the deadline for publication of the first round of harmonised transparency reports by providers of intermediary services under the Digital Services Act (DSA). The European Commission introduced standardised reporting rules that compel the largest online platforms to disclose, in comparable and machine-readable form, how they police content across their services.
The requirements stem from a Commission Implementing Regulation under the DSA, adopted on 4 November 2024, with the mandatory standardised templates taking effect on 1 July 2025. The regulation mandates a uniform template, based on comma-separated value (CSV) files, that providers of intermediary services must use when publishing transparency reports.
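For analysts, the choice of CSV matters: it means the reports can be loaded straight into ordinary data tooling. A minimal sketch in Python, assuming a hypothetical file name and column labels (the template annexed to the regulation defines the official headers):

```python
import pandas as pd

# Hypothetical file name and column labels, used for illustration only;
# the template annexed to the Implementing Regulation defines the
# official headers.
report = pd.read_csv("vlop_transparency_report_2025H2.csv")

# Because every provider uses the same template, the same column
# selection works unchanged across platforms, with no bespoke parsing.
print(report[["content_category", "member_state", "month", "notices_received"]].head())
```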
Previously, platforms filed disclosures in incompatible formats, with inconsistent labels and categories, making meaningful cross-platform analysis effectively impossible. Under the new framework, most providers must report annually. Very large online platforms (VLOPs) and very large online search engines (VLOSEs), defined as services with more than 45 million average monthly users in the EU, face a more demanding schedule: twice-yearly reports covering January to June, due by the end of August, and July to December, due by the end of February. The first harmonised submissions under these rules fell due at the end of February 2026, covering data collected from July 2025 onwards. Those reports are now publicly accessible via the European Commission's DSA transparency overview page.
The template requires both quantitative and qualitative data, broken down by content category, EU member state, and calendar month. Quantitative sections cover orders received from member state authorities — including removal orders and orders to provide user information — as well as median response times. Platforms must also report on notices submitted by users and by so-called trusted flaggers identifying allegedly illegal material, detailing the actions taken and the time elapsed. A separate section addresses complaints and internal appeals, recording how many were received and whether original moderation decisions were upheld or reversed.
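To illustrate the shape of those breakdowns, here is a hedged sketch with invented figures and column names; only the three-way split by content category, member state, and calendar month reflects the template itself:

```python
import pandas as pd

# Invented figures and column names; each row stands for one
# (category, member state, month) cell of the template. The
# median_response_hours column mirrors the template's median
# response-time disclosure.
df = pd.DataFrame({
    "content_category": ["scams_and_fraud", "scams_and_fraud", "cyber_violence"],
    "member_state": ["DE", "FR", "DE"],
    "month": ["2025-07", "2025-07", "2025-08"],
    "notices_received": [1200, 950, 430],
    "median_response_hours": [6.5, 8.0, 12.0],
})

# Roll the monthly, per-country cells up to EU-wide totals per category.
totals = df.groupby("content_category")["notices_received"].sum()
print(totals)
```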
Further mandatory disclosures concern the use of automated moderation tools, including the proportion of decisions made with full or partial automation and associated accuracy indicators. Platforms must also state the number of content moderators employed, distinguishing between in-house staff and those engaged through third parties.
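A rough sketch of how those automation figures relate, using made-up numbers; the definition of "accuracy" here, as the share of fully automated decisions upheld on internal appeal, is an illustrative assumption rather than the template's own:

```python
# Made-up figures illustrating the automation disclosures the template
# requires. "Accuracy" is defined here, purely for illustration, as the
# share of fully automated decisions later upheld on internal appeal.
decisions_total = 180_000
fully_automated = 120_000
partially_automated = 40_000

automation_share = (fully_automated + partially_automated) / decisions_total
print(f"Decisions involving automation: {automation_share:.1%}")  # 88.9%

upheld_on_appeal = 112_500  # hypothetical
accuracy = upheld_on_appeal / fully_automated
print(f"Illustrative accuracy indicator: {accuracy:.1%}")  # 93.8%
```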
All moderation data must be classified according to fifteen mandatory primary categories. These include animal welfare, cyber violence, scams and fraud, and the protection of minors — a category that encompasses child sexual abuse material and grooming. Platforms may use additional subcategories at their discretion. A final qualitative sheet invites free-text descriptions providing context for the figures, including any changes to the moderation policy during the reporting period.
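In practice, a platform must fold its own internal labels into those primary categories. A hypothetical mapping, in which only the category names drawn from the regulation are real and every internal label is invented:

```python
# Hypothetical mapping from a platform's internal moderation labels to
# the regulation's mandatory primary categories. The internal labels
# are invented for illustration.
SUBCATEGORY_TO_PRIMARY = {
    "romance_scam": "scams_and_fraud",
    "phishing": "scams_and_fraud",
    "grooming": "protection_of_minors",
    "doxxing": "cyber_violence",
}

def primary_category(internal_label: str) -> str:
    """Resolve an internal label to a mandatory primary category."""
    return SUBCATEGORY_TO_PRIMARY.get(internal_label, "uncategorised")

print(primary_category("phishing"))  # -> scams_and_fraud
```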
The European Commission has aligned the new framework with the DSA Transparency Database, which logs individual moderation decisions. Officials say the alignment will allow consistency checks between aggregate reports and underlying decision records, strengthening oversight of platforms' compliance with fundamental rights obligations, including freedom of expression.
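What such a consistency check could look like, sketched with stand-in data rather than a real Transparency Database export:

```python
import pandas as pd

# Stand-ins: `db_records` mimics per-decision entries in the DSA
# Transparency Database; `reported` mimics a platform's aggregate figures.
db_records = pd.DataFrame({
    "category": ["scams_and_fraud"] * 3 + ["cyber_violence"] * 2,
    "month": ["2025-07", "2025-07", "2025-08", "2025-07", "2025-08"],
})

db_counts = db_records.groupby(["category", "month"]).size()

# Aggregates as a platform might have reported them; the second figure
# deliberately disagrees with the decision-level records above.
reported = {
    ("scams_and_fraud", "2025-07"): 2,
    ("cyber_violence", "2025-08"): 3,
}

for key, claimed in reported.items():
    logged = int(db_counts.get(key, 0))
    status = "consistent" if logged == claimed else "mismatch"
    print(key, f"report={claimed}", f"database={logged}", status)
```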
Researchers, journalists, and civil society groups are among those expected to benefit from the comparability the new template provides, particularly in tracking trends in areas such as child protection, online fraud, and cyber violence across different services. The Digital Services Act applies to all providers offering services to users in the European Union, regardless of where those providers are established.