Elon Musk’s X Platform Dominates EU Disinformation Landscape, Report Finds

Nearly 90 percent of disinformation content analyzed by the European External Action Service (EEAS) in 2025 was distributed through Elon Musk’s X platform, according to a comprehensive new report on foreign interference in Europe’s information space.

The EEAS study, titled “Threats of external interference and information manipulation,” examined approximately 43,000 pieces of disinformation content, finding that X accounted for 88 percent of the material, dramatically outpacing other platforms like Telegram (3 percent) and Facebook (2 percent).

Analysts attribute X’s dominance in the disinformation landscape to “the presence of networks of coordinated inauthentic behavior, the ease of creating fake accounts, and easier access to data.” The report notes that most major social media platforms restrict the data access needed for a comprehensive assessment of information manipulation, meaning X’s relative openness makes such activity both easier to conduct and easier to measure.

The EEAS found that disinformation campaigns typically operate across multiple platforms simultaneously, with actors maintaining different accounts across social networks while also disseminating content through messaging apps like WhatsApp and Telegram. “The aim is to infiltrate the information space to increase the visibility and credibility of the content, while targeting specific audiences based on socio-demographic and geographical factors,” the report states.

Artificial intelligence has emerged as a critical factor in the evolving threat landscape, with AI use in disinformation campaigns against the EU surging 259 percent compared to 2024. “Russian and Chinese actors have fully implemented AI tools to speed up content production and increase meddling activities with fewer resources,” the report warns. European officials noted that AI technology is significantly reducing the cost of conducting sophisticated disinformation operations.

Political figures have been the primary targets, accounting for 66 percent of attacks. Ukrainian President Volodymyr Zelensky, French President Emmanuel Macron, German Chancellor Friedrich Merz, and European Commission President Ursula von der Leyen were frequently targeted. The EEAS noted that these campaigns typically attack “what an individual stands for” while attempting to “instrumentalize the platform they have to reach specific audiences.”

Among organizations, political entities faced the most attacks (36 percent), followed by media outlets (23 percent) and military or security organizations (22 percent). The report explains that disinformation actors specifically targeted defense and media sectors as “crucial to democracy,” directing “derogatory narratives, attempts at impersonation and direct smear campaigns” at these institutions.

Election periods and public demonstrations proved particularly attractive for disinformation campaigns, with actors exploiting “moments of high tension and emotional charge” as vulnerabilities to “reach their target audiences, influence their thinking and amplify existing cognitive prejudices.”

The report reveals a significant strategic shift in Russian disinformation efforts, which have increasingly focused on the European Union rather than the United States. Of the 540 incidents analyzed in 2025, approximately 29 percent were attributed to Russia, followed by China at 6 percent. The remaining 65 percent could not be definitively attributed but showed indicators linked to Russian or Chinese infrastructure.

“Russia’s main target was Ukraine, followed by Moldova, where parliamentary elections were held in September 2025,” the report notes. Moscow’s operations aim to “fuel new or deepen existing divisions” in societies by mobilizing anti-system sentiment and undermining confidence in the EU by portraying it as either “undemocratic and aggressive or too weak.”

Chinese disinformation activities display their own characteristics, combining conspiracy theories with efforts to “suppress narratives that go against some of their fundamental interests.” The report describes “aggressive measures such as intimidation and harassment of critical voices to suppress information even outside their borders.” China consistently portrays the EU as “subservient to the United States” in terms of foreign policy.

The financial investment in these operations is substantial. The EEAS estimates Russia allocates approximately 1.6 billion euros for disinformation campaigns, while China’s spending ranges between 6 and 8.6 billion euros.

While Russia and China typically conduct separate operations, the report identifies instances of coordination, primarily described as “opportunistic” collaboration. One notable example occurred in October 2025, when Chinese disinformation campaigns supported Russian actions during incidents involving Russian drones flying over European airspace.

The EEAS cautions that the report should not be considered “exhaustive,” as its monitoring does not cover all regions and languages, representing “only a small portion of the activities of these actors.”


