In a comprehensive analysis of global disinformation campaigns, the European External Action Service (EEAS) has revealed that Elon Musk’s X platform served as the primary conduit for disinformation targeting the European Union throughout 2025, accounting for an overwhelming 88% of the approximately 43,000 pieces of content examined.
The report, titled “Threats of external interference and information manipulation,” found that X far outpaced other platforms in hosting disinformation, with Telegram and Facebook trailing significantly at just 3% and 2% respectively. The EEAS attributes this concentration to several factors, including “networks of coordinated inauthentic behaviour, the ease of creating fake accounts, and easier access to data” on the platform.
According to European officials, disinformation actors typically deploy multi-platform strategies, operating simultaneous accounts across social networks and messaging apps like WhatsApp and Telegram. “The aim is to infiltrate the information space to increase the visibility and credibility of the content, while targeting specific audiences based on socio-demographic and geographical factors,” the report states.
The integration of artificial intelligence has dramatically transformed the disinformation landscape, with AI-driven content increasing by a staggering 259% compared to 2024. “Russian and Chinese actors have fully implemented AI tools to speed up content production and increase meddling activities with fewer resources,” the EEAS notes. One European official emphasized that AI technology has significantly reduced the cost of mounting sophisticated disinformation operations.
The report reveals that politicians bore the brunt of these campaigns, with 66% of attacks targeting public figures. Ukrainian President Volodymyr Zelensky, French President Emmanuel Macron, German Chancellor Friedrich Merz, and European Commission President Ursula von der Leyen featured prominently among those targeted. Rather than focusing on personal attacks, most campaigns sought to undermine “what an individual stands for” and exploit their platform to reach specific audiences.
Among organizations, political entities received 36% of attacks, followed by media outlets (23%) and military or security organizations (22%). The EEAS noted that these sectors were deliberately targeted to “undermine confidence in defense capabilities” and attack institutions “crucial to democracy.”
Election periods proved particularly vulnerable to disinformation campaigns. Threat actors also exploited public demonstrations and civil unrest to “fuel perceptions of chaos, fear and disorder,” typically directing blame toward local administrations. “Moments of high tension and emotional charge are seen by the actors of these threats as vulnerabilities that allow them to reach their target audiences, influence their thinking and amplify existing cognitive prejudices,” the report explains.
Russia remained the dominant source of disinformation against the EU in 2025, responsible for 29% of the 540 incidents analyzed, while China accounted for 6%. The remaining 65% could not be definitively attributed, although many of those incidents contained “indicators linked to Russian or Chinese infrastructure.”
In what the EEAS describes as a “strategic recalibration,” Russia shifted its focus more heavily toward the European Union in 2025, reducing its campaigns against the United States. Russian operations primarily targeted Ukraine and Moldova, with the latter facing parliamentary elections in September 2025. Moscow’s strategy centered on deepening societal divisions and undermining confidence in the EU by portraying it as either “undemocratic and aggressive” or “too weak.”
Chinese disinformation efforts displayed distinct characteristics, combining conspiracy theories with efforts to “suppress narratives that go against some of their fundamental interests.” Beijing frequently employed intimidation tactics against critics and worked to depict the EU as “subservient to the United States” in foreign policy matters.
The financial commitment to these operations is substantial. The EEAS estimates that Russia allocates approximately 1.6 billion euros annually to disinformation campaigns, while China’s expenditure ranges between 6 and 8.6 billion euros.
While Russia and China typically operate independently, the EEAS documented occasional coordination between the two powers. These collaborative efforts were generally “opportunistic,” with each country amplifying the other’s attacks when advantageous. One notable example occurred in October 2025, when China supported Russia with disinformation campaigns during Russian drone flights over European airspace.
The EEAS cautions that its report should not be considered “exhaustive,” as its monitoring does not cover all regions and languages, representing only “a small portion of the activities of these actors.”
18 Comments
While concerning, this report highlights the importance of rigorous fact-checking and media literacy efforts to equip the public with the tools to discern truth from fiction online.
Absolutely. Building resilience against disinformation requires a multi-pronged approach involving platforms, policymakers, and citizens working in tandem.
The disproportionate role of X in this disinformation campaign underscores the platform’s failure to curb the spread of misinformation. Tougher regulatory action is clearly warranted.
Absolutely. X’s lax content moderation policies have allowed it to become a breeding ground for malicious actors. Platforms must be held accountable for the harms they enable.
The report’s findings reinforce the urgent need for coordinated global efforts to counter the proliferation of online disinformation, which poses a threat to democratic processes worldwide.
This report highlights the urgent need for greater collaboration between platforms, policymakers, and civil society to address the complex challenge of online disinformation. A multi-stakeholder approach is crucial.
This is a disturbing development that highlights the fragility of our information ecosystem. Policymakers must act swiftly to rein in platforms that prioritize engagement over truth.
Agreed. Protecting the integrity of the public discourse should be the top priority, even if it means imposing stricter rules on social media companies.
The EU’s findings underscore the need for comprehensive reforms to social media regulation and content moderation practices. Platforms can no longer ignore their role in amplifying harmful disinformation.
Effective regulation is crucial to rein in the deluge of misinformation on social media. Policymakers must act decisively to protect citizens from malicious manipulation.
While the findings are worrying, they also present an opportunity to strengthen the EU’s resilience against foreign interference and information manipulation. Collaborative solutions are needed.
Agreed. This report should spur EU policymakers to work closely with platforms, fact-checkers, and civil society to develop a comprehensive strategy to combat online disinformation.
The dominance of X in this disinformation campaign is a sobering reminder of the platform’s outsized influence and the need for greater transparency and accountability.
Platforms like X cannot be allowed to become conduits for malicious actors seeking to undermine democracy. Robust content moderation and data sharing are essential.
It’s alarming but not surprising that X has become a breeding ground for disinformation. The platform’s lax content moderation policies have long enabled the spread of misinformation.
X’s outsized role in this disinformation campaign underscores the need for greater platform accountability and transparency. Stronger regulations are clearly overdue.
Concerning that X has become a hub for disinformation targeting EU politicians. Platforms need to be more vigilant in detecting and removing coordinated inauthentic behavior to protect the integrity of the information space.
Agreed. Platforms must prioritize transparency and accountability to curb the spread of misinformation that can undermine democratic processes.