Tech giants and fact-checkers have published their first round of transparency reports under the EU’s Code of Conduct on Disinformation, now formally integrated into the Digital Services Act (DSA) regulatory framework. These reports, available through the Code’s Transparency Centre, detail measures implemented by signatories to combat online disinformation from July through December 2025.
Major platforms including Google, Meta, Microsoft, and TikTok submitted reports alongside fact-checking organizations, researchers, civil society groups, and advertising industry representatives. The European Commission highlighted this as the first complete reporting cycle since the Code’s formal recognition within the DSA structure.
The reports address several critical areas, including responses to the Ukraine conflict and efforts to protect electoral integrity. They also provide implementation data for anti-disinformation measures and outline policy developments under the DSA framework.
This reporting round carries heightened significance following the Code’s regulatory elevation. On February 13, 2025, at the signatories’ request, the European Commission and the European Board for Digital Services formally endorsed the Code as a DSA-recognized code of conduct. By July 1, 2025, it had been fully integrated into the DSA’s co-regulatory framework.
This arrangement marks a substantial shift from the Code’s previously voluntary nature. Signatories must now undergo independent annual audits of their compliance with its commitments. The Commission has positioned the Code as a key benchmark for assessing compliance with Article 35 of the DSA, particularly for providers of very large online platforms and search engines.
Reporting requirements vary based on the type of organization. Very large online platforms and search engines—including Google Search, YouTube, Facebook, Instagram, WhatsApp, Bing, LinkedIn, and TikTok—must report semi-annually on actions taken by their services. Non-platform signatories face annual reporting obligations under the DSA structure.
This regulatory evolution reflects the EU’s broader strategy to connect platform self-reporting with formal oversight mechanisms. By embedding the disinformation Code within the DSA framework, European regulators are leveraging voluntary commitments, transparency reporting, and auditing as components of a co-regulatory approach to address systemic online risks.
It’s worth noting that the Commission’s announcement focuses on the publication of these reports rather than evaluating their quality or effectiveness. The reports describe measures, data, and policy developments without assessment of whether the actions were sufficient—an important distinction in politically sensitive areas such as election integrity and crisis-related misinformation.
The EU’s approach demonstrates how the DSA serves not only to impose direct legal obligations on major digital platforms, but also to anchor previously voluntary commitments within a more structured regulatory environment. The effectiveness of this strategy will become clearer through continued reporting, auditing, and review processes, which will determine how much practical weight the Code carries within the DSA framework.
Industry observers note that this marks a significant step in Europe’s digital regulation strategy, potentially establishing a model that other jurisdictions might consider adopting. As disinformation continues to threaten democratic processes and public discourse globally, the EU’s formalized approach to platform accountability could influence international standards for digital governance.
The transparency reports represent just one component of the EU’s comprehensive digital regulatory agenda, which also includes the Digital Markets Act aimed at curbing the power of digital gatekeepers and ensuring fairer competition in online markets.
8 Comments
The DSA’s regulatory elevation of the Code of Conduct seems like a positive development. Disinformation remains a major challenge, so strengthening the compliance framework is a sensible move. Looking forward to seeing how these initial reports inform future policy refinements.
Glad to see fact-checkers, researchers, and civil society groups involved in these reporting efforts alongside tech platforms. A diverse set of stakeholders will be crucial for developing effective anti-disinformation strategies. Curious to see how the Commission evaluates the first round of submissions.
The integration of the Code of Conduct into the DSA is an important step, but the real test will be in the implementation and enforcement. Curious to see how the Commission oversees compliance and what happens if signatories fall short of expectations.
Protecting electoral integrity is a key priority, so I’m encouraged to see it highlighted in these initial DSA transparency reports. Disinformation poses a serious threat to democratic processes, so it’s important these measures continue to evolve and adapt.
Agreed, electoral integrity is paramount. These reports will provide valuable insights to help policymakers refine the DSA framework and ensure platforms are taking appropriate action against election-related disinformation.
Interesting to see the initial transparency reports under the EU’s Digital Services Act. Glad to see efforts to combat online disinformation, especially around critical events like the Ukraine conflict and elections. Looking forward to seeing how these measures evolve over time.
The integration of the EU’s Code of Conduct on Disinformation into the DSA regulatory framework is an important step. Transparency around anti-disinformation efforts by major tech platforms and other stakeholders will be crucial. Curious to see how they address emerging challenges.
Absolutely, transparency is key. It will be important to monitor how effectively these measures are implemented and their real-world impact on reducing the spread of harmful disinformation.