The European Commission yesterday published guidelines on safeguarding professional media content from arbitrary removal by tech giants, marking a critical step in implementing landmark press freedom legislation.
The guidelines, released on 5 February, clarify how platforms such as Meta, Google, and X must apply special protections for registered news organisations under Article 18 of the European Media Freedom Act. The rules, which took effect last August, establish what regulators call a "media privilege"—requiring very large online platforms to treat journalism differently from user-generated content.
Under the framework, platforms must provide advance notice before restricting media content through removal, demotion, or reduced visibility. News organisations gain a minimum 24-hour window to contest such decisions and receive detailed justifications—except in cases involving imminent public safety threats such as incitement to violence.
The guidance follows a public consultation last year involving media groups, journalism associations, regulators, and civil society organisations, aimed at addressing early implementation challenges.
To qualify for protections, media providers must submit declarations through dedicated platform tools, demonstrating editorial independence, regulatory oversight by national authorities or press councils, freedom from undue state influence, and human oversight of AI-generated content. Platforms can consult the European Board for Media Services or national regulators when verifying applications and may engage independent fact-checkers for complex cases.
False declarations carry penalties under the Digital Services Act, which permits fines up to 6 percent of global turnover.
Once registered, media organisations receive priority complaint handling and access to expedited dispute settlement through the Board. Platforms must report annually on compliance and engage in good-faith resolution of disputes, including content reinstatement when moderation proves unjustified.
Press freedom advocates have welcomed the clarity, arguing it strengthens journalism's digital presence amid growing concerns over platform power and disinformation. However, some critics warn of potential loopholes in verification procedures that platforms might exploit.
Tech companies have embraced the guidance as necessary clarification that helps them avoid substantial DSA penalties. The framework forms part of the broader objectives of the EMFA, adopted in 2024, to insulate media from economic pressures, state interference, and algorithmic bias.
Image Credit: Jana Dobreva for Fine Acts, under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International licence (CC BY-NC-SA 4.0)