EU’s Disinformation Code Now Enforceable Under Digital Services Act
Europe’s fight against online disinformation has entered a new phase as the Code of Conduct on Disinformation becomes officially enforceable on July 1, 2025. What began as a voluntary framework has evolved into a key compliance mechanism under the Digital Services Act (DSA), placing significant new obligations on major tech platforms operating in the European Union.
The transformed Code now requires Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to meet enhanced transparency and auditing requirements designed to combat disinformation. For these tech giants, adherence to the Code has become a critical risk-mitigation measure and evidence of DSA compliance. During mandatory audits, platforms must demonstrate their commitment to these standards or face potential regulatory action from Brussels.
This regulatory milestone arrives at a delicate moment in EU-US relations, with high-stakes trade talks approaching a July 9 deadline. Despite mounting pressure, EU officials remain resolute. “The DSA and the DMA are not on the table in trade negotiations,” a Commission spokesperson stated on Monday. “We are not going to adjust the implementation of our legislation based on actions of third countries. If we started down that road, we’d have to do it with many countries.”
Similar tensions are playing out beyond Europe. Canada recently faced significant blowback from the United States after introducing a digital services tax targeting American tech companies. President Donald Trump criticized the move as “obviously copying the European Union,” while Meta executive Joel Kaplan praised Trump for “standing up for American tech companies in the face of unprecedented attacks from other governments.” The dispute led to suspended trade talks until Canada withdrew its digital tax.
As Brussels intensifies enforcement to hold platforms accountable, EU regulators find themselves increasingly defending against accusations of censorship, particularly from Washington where MAGA-aligned officials are monitoring developments closely—with vocal support from tech platforms themselves.
Freedom of Expression Concerns
A central debate surrounding the Code’s integration into the DSA framework concerns its impact on freedom of expression. While the European Commission maintains the Code is voluntary, its transformation into a compliance tool under Article 35 of the DSA means failing to meet commitments could trigger investigations or financial penalties.
In May, U.S. Representative Jim Jordan and four colleagues sent a letter to EU Commissioner Michael McGrath expressing concern that the DSA’s requirements could establish de facto global censorship standards. They argued that since most platforms won’t develop separate content moderation systems for Europe and other regions, the rules could effectively restrict Americans’ online speech.
The Commission strongly rejects this characterization. “The [Disinformation] Code is not about censorship,” explained Thomas Regnier, a Commission spokesperson. “On the contrary, it is a framework aiming to create a transparent, fair and safe online environment, while fully upholding the fundamental rights of users, including freedom of expression.”
EU officials emphasize that the DSA’s approach is structural rather than content-specific. Instead of targeting individual posts, the legislation focuses on systemic issues—addressing opaque recommendation algorithms and advertising networks that shape what users see online.
“The Code of Practice on Disinformation is not geared toward content removal,” Regnier noted. “Its commitments aim to protect against manipulation of online platforms, giving users more context and tools to navigate safely—not suppressing content.”
Clare Melford, CEO of the Global Disinformation Index (GDI), argues that framing these regulations as censorship misrepresents their purpose. “Trying to say governments are censoring is a fundamental misunderstanding of how technology works,” she said. “The speech that is actually being suppressed is moderate speech, because it’s less profitable.”
The Audit Challenge
The Code of Practice on Disinformation originated as a voluntary initiative in 2018 before being strengthened in 2022. It now serves as a template for risk mitigation obligations under Article 34 of the DSA. While signing the Code remains optional, platforms must meet comparable standards, and failure to do so can impact their DSA compliance assessments.
“Compliance with the Code is voluntary. Compliance with the DSA is not,” a Commission spokesperson clarified.
Under the DSA, designated VLOPs must undergo annual independent audits, which assess whether disinformation risks have been adequately addressed. The Code’s commitments serve as benchmarks for this evaluation.
Civil society organizations warn that the effectiveness of these measures depends on robust audit protocols. “Without a clear audit framework and access to meaningful data, these audits won’t be credible,” said Melford. “The real risk isn’t censorship—it’s that auditors won’t know what to look for.”
Paula Gori from the European Digital Media Observatory (EDMO) shares similar concerns, noting that some platforms have already withdrawn from their Code commitments. A recent EDMO report highlighted “consistent gaps in transparency, independent oversight and measurable outcomes,” warning that without stronger compliance, the Code “risks remaining performative.”
“Enforcers and auditors will need to do intense work to keep all these different layers together,” Gori noted, calling for standardized risk assessment methodologies and comparable reporting structures across platforms.
As Brussels strengthens its digital regulations amid tense transatlantic trade discussions, the Commission insists these rules are non-negotiable, even as Washington voices concerns about regulatory overreach. What remains uncertain is whether platforms will implement meaningful reforms, and whether the independent audits overseen by the Commission will have sufficient clarity and authority to ensure accountability.
As Melford puts it: “The Code has the potential to work. But only if it’s backed by transparent data, credible audits, and a Commission willing to follow through.”