UK to Combat Foreign Disinformation Through Sanctions and Intelligence Action
Foreign disinformation campaigns targeting the United Kingdom will be countered through a multi-pronged approach that includes sanctions, public attribution, and intelligence-led operations against state-linked networks, according to a senior government minister.
Speaking before a parliamentary inquiry on Tuesday, Foreign, Commonwealth and Development Office (FCDO) minister Stephen Doughty outlined the government’s strategy, emphasizing that digital platforms are “just the medium” for these influence operations rather than their source.
The statement came as lawmakers pressed for the UK to adopt an approach more closely aligned with the European Union’s Digital Services Act (DSA), which places enforcement responsibilities on technology platforms to combat disinformation. Doughty’s comments suggest the UK is instead focusing on targeting the originators of disinformation rather than simply the channels through which it spreads.
This approach marks a significant policy distinction from the EU, which has implemented strict regulations requiring tech companies to police content on their platforms or face substantial penalties. The UK appears to be taking a more geopolitical view of the problem, treating disinformation as a matter of national security rather than primarily a content moderation issue.
Security experts have long warned about the increasing sophistication of foreign influence operations targeting Western democracies. Russia, China, and Iran have been identified by intelligence agencies as primary sources of coordinated disinformation campaigns designed to polarize societies, undermine trust in institutions, and influence electoral outcomes.
The UK’s strategy reflects growing recognition that disinformation represents a significant threat to national security and democratic processes. By focusing on sanctions and attribution, the government is signaling a desire to impose costs on state actors who engage in information warfare.
Attribution—publicly identifying the source of disinformation campaigns—serves both as a deterrent and as a way to inoculate the public against manipulation by revealing the tactics and intentions behind foreign influence operations. This approach has been used previously when the UK government formally attributed cyberattacks and disinformation efforts to state actors such as Russia’s GRU intelligence agency.
The intelligence-led action mentioned by Doughty likely refers to covert operations to disrupt disinformation networks before they can widely disseminate false narratives. This could include cybersecurity operations, counterintelligence work, and coordination with allies to identify and neutralize threats.
The UK’s stance comes amid increasing global concern about the impact of disinformation on democratic processes. Recent elections in various countries have faced unprecedented levels of foreign interference, with social media platforms serving as primary vectors for the spread of misleading content.
While the UK’s approach differs from the EU’s platform-centric regulations, it remains to be seen how effective either strategy will be in combating the evolving threat of disinformation. Critics argue that a comprehensive approach should include both platform regulation and direct action against state actors.
The parliamentary inquiry is part of a broader effort to develop a coherent national strategy against foreign interference. As disinformation techniques grow more sophisticated, employing artificial intelligence and deepfake technology, governments worldwide are struggling to find effective countermeasures that balance security concerns with civil liberties.
Industry observers note that the UK’s post-Brexit regulatory environment allows for a more independent approach to digital governance, though coordination with international partners remains crucial for an effective response to transnational threats.
As the inquiry continues, policymakers will need to address questions about the scope of intelligence operations, criteria for sanctions, and mechanisms for international cooperation in the ongoing battle against state-sponsored disinformation campaigns.
10 Comments
Interesting to see the UK government taking a more targeted approach to combating disinformation, focusing on the source rather than just platform regulation. I wonder how effective sanctions and intelligence operations will be against state-backed propaganda efforts.
The UK’s strategic shift away from platform regulation and toward disrupting disinformation networks is a bold move. I’m curious to see if their sanctions and intelligence-led approach can meaningfully counter state-sponsored propaganda efforts. It’s a high-stakes gambit.
Targeting the sources of disinformation through sanctions and intelligence operations is an aggressive approach from the UK. It suggests they believe this is more impactful than just regulating tech platforms. Will be fascinating to see how successful this strategy proves to be.
The UK’s divergence from the EU’s platform-centric approach to combating disinformation is notable. Focusing on the sources rather than just the channels is a potentially more effective but also riskier strategy. I’m curious to see how it unfolds.
The UK is taking a proactive stance against foreign disinformation, but avoiding platform regulation is a bold choice. I wonder if this more direct strategy will prove more effective than the EU’s efforts to make tech companies police their own content.
This seems like a pragmatic approach from the UK government – going after the sources of disinformation rather than just the channels. But it will be interesting to see if they can successfully identify and disrupt state-backed propaganda networks.
Sanctions and intelligence ops against disinformation sources is an intriguing policy direction for the UK. It suggests a belief that targeting the originators is more impactful than just regulating the platforms. Time will tell if this strategy pays off.
The UK’s sanctions-focused strategy is a noteworthy divergence from the EU’s platform-centric Digital Services Act. It suggests a belief that disrupting disinformation networks is more impactful than just policing content. But the real test will be in the execution.
I like the UK’s proactive stance in going after the roots of disinformation rather than just moderating content on platforms. But disrupting state-backed propaganda operations is an immense challenge. This will be an interesting policy experiment to watch unfold.
While the UK’s approach seems more aggressive than the EU’s, I’m curious to see if it can meaningfully disrupt state-sponsored disinformation campaigns. Sanctions and intelligence ops are powerful tools, but countering modern propaganda is a daunting challenge.