Rising Threat of Information Manipulation Demands Stronger EU Response
Information manipulation and interference by both foreign and domestic actors have reached alarming levels across Europe, posing a significant threat to democratic stability. The tactics have evolved dramatically from mere online disinformation to include physical sabotage, creating a multi-dimensional challenge for European security.
Recent cases highlight how sophisticated these operations have become. In Romania’s 2024 presidential election, an illegal AI-driven campaign in support of candidate Calin Georgescu deployed fake accounts and paid influencers to spread false narratives. In Hungary, state media has been criticized for information control that favors certain political narratives, while Slovakia faces European Commission warnings over a controversial NGO law requiring organizations that receive foreign funding to register under special designations.
The scope of these threats has expanded beyond digital realms. In Germany, over 270 vehicles were sabotaged in a coordinated operation initially blamed on climate activists but later linked to foreign intelligence operations. These physical acts of interference represent a dangerous escalation of tactics designed to destabilize European societies.
“Digital platforms have fostered echo chambers that isolate and amplify anti-democratic views,” explains a recent analysis. “Yet platform responses remain inadequate.” Meta, for instance, has scaled back its fact-checking teams while announcing plans to ban political advertising in the EU from October 2025, citing legal complexities under the new Transparency and Targeting of Political Advertising (TTPA) regulation.
AI Amplifies Disinformation Threat
Artificial intelligence has dramatically increased the efficiency of disinformation campaigns. A 2024 study revealed that 86% of people worldwide had encountered fake news, with AI-generated content often perceived as more trustworthy than human-created falsehoods.
During election campaigns across Central and Eastern Europe, AI-generated materials have flooded social media. Hungary’s Jobbik Party has used AI to create negative portrayals of immigrants, while in Romania, AI was deployed to fabricate compromising videos of candidates making false statements.
Current detection tools struggle to keep pace with these rapidly evolving threats. As one security expert noted, “Reactive approaches have proven inadequate – countermeasures must evolve in sophistication online and strength offline.”
EU Response Gaining Momentum
The European Union’s approach to combating disinformation has evolved significantly since 2015, when it established the East Stratcom Task Force. However, initial efforts were hampered by underfunding and limited enforcement powers.
The 2022 Digital Services Act (DSA) marked a significant step forward, introducing potential sanctions for online platforms failing to curb misinformation. Yet implementation remains challenging. TikTok is under multiple investigations but has faced no sanctions to date. Similarly, the European Commission has delayed its investigation into platform X (formerly Twitter) for alleged DSA violations, raising questions about political considerations in enforcement actions.
The European Parliament formed a specialized 33-member committee in December 2024 with a 12-month mandate to address information manipulation and interference; its first report is expected later this year.
Vulnerable Groups Face Greatest Risk
Certain populations face heightened vulnerability to manipulation. Women in public life or from minority backgrounds are frequently targeted by gendered disinformation designed to silence them and discourage their participation in democratic life. Roma communities, elderly populations, ethnic minorities, and teenagers are similarly exposed.
“The Roma community remains the most vulnerable,” notes advocacy group Roma for Democracy. “Their longstanding exclusion has fueled anti-system sentiment, which malign actors readily exploit.” Recent events in Romania’s elections confirmed this targeted approach.
Moving Beyond Reactive Measures
Experts recommend a comprehensive strategy to counter these threats. Key recommendations include investing in AI for efficient threat detection, reinforcing state institutions with proper mandates and training to investigate malicious actors, expanding platform accountability through rigorous DSA enforcement, and embedding civic education in defense planning.
The newly established European Democracy Shield, along with increased interagency cooperation, represents an evolving awareness of the need for proactive, cross-institutional responses. However, without stronger coordination between EU institutions, member states, and civil society organizations, Europe’s democratic processes remain vulnerable.
As information manipulation continues to evolve, combining online and offline tactics with increasing sophistication, the EU’s response must adapt accordingly. Without decisive action, not only future elections but broader social cohesion and democratic values face systemic risk in an increasingly polarized information environment.
12 Comments
The examples cited, from Romania’s AI-driven fake accounts to Hungary’s media control, demonstrate the complex, cross-border nature of these threats. A unified EU response will be essential to effectively counter them.
The scope and scale of these threats are truly alarming. Addressing information manipulation and hybrid warfare will require a comprehensive, cross-border strategy backed by robust resources and political will.
While the challenges are daunting, I’m encouraged to see the EU taking this issue seriously and developing a coordinated response. Upholding democratic values and safeguarding the integrity of elections must remain a top priority.
Agreed. Remaining vigilant and adapting our defenses to counter evolving tactics will be crucial. With a united front, European democracies can overcome these hybrid threats.
Protecting democracy from information warfare and foreign interference is no easy task, but it’s critical. Strengthening cybersecurity, media literacy, and transparency around political funding and advertising will all be important parts of the solution.
Agreed. The evolving tactics, from online disinformation to physical sabotage, highlight the need for a comprehensive, multi-pronged approach to safeguard democratic institutions.
Strengthening media literacy and transparency around political funding and advertising are crucial steps. Empowering citizens to critically evaluate information and expose manipulation attempts is key to safeguarding democracy.
This issue cuts to the heart of democratic resilience. Maintaining public trust, combating disinformation, and securing critical infrastructure will all be essential in the fight to protect our democratic institutions.
This is a concerning trend that requires a robust and coordinated response from European authorities. Information manipulation and hybrid threats pose serious risks to democratic processes and stability. Addressing these challenges will be crucial in the years ahead.
Incidents like the vehicle sabotage in Germany underscore how hybrid threats can extend beyond the digital realm. Protecting against both online and offline disruption will be a key challenge for policymakers.
Absolutely. The expanded scope of these threats requires a holistic security approach that addresses both cyber and physical vulnerabilities. Coordinated intelligence-sharing and rapid response will be critical.
The rising threat of information manipulation is deeply concerning. Protecting democratic processes and institutions from these sophisticated, multifaceted attacks will require sustained, collaborative efforts across Europe.