In the wake of recent European elections, questions are mounting about the effectiveness of the EU’s Digital Services Act (DSA) in safeguarding electoral integrity on social media platforms. An examination of elections in Romania, Germany, and Poland reveals significant challenges in implementing the regulatory framework designed to combat online threats to democratic processes.

Social media platforms have become central arenas for public discourse, but the rules governing these spaces remain largely in the hands of a few private companies. While Europe has positioned the DSA as its primary defense against platform-driven electoral harm, recent evidence suggests the legislation may not be living up to expectations.

The DSA’s systemic risk framework requires major platforms to identify, assess, and mitigate risks stemming from their services, particularly those affecting “civic discourse and electoral processes.” This self-assessment model is meant to foster collaboration between platforms, regulators, and civil society, with platforms publishing annual reports and submitting to independent audits.

In March 2024, the European Commission published guidelines outlining recommended measures to protect election integrity, but implementation has been inconsistent across platforms.

Across the Romanian, German, and Polish elections, several recurring issues emerged. Coordinated influence operations were prevalent, with Romanian intelligence services uncovering a campaign involving over 25,000 TikTok accounts. While TikTok claimed to have banned these accounts, questions remain about detection timing and effectiveness.

Improper labeling of political content was another common problem. In Germany, many political profiles lacked proper election labels, while in Poland, there were attempts to impersonate candidates. Though platforms claimed to have mitigation measures in place, external observers frequently documented failures.

Electoral misinformation and disinformation appeared in all three elections. During the German campaign, political advertisements on Facebook contained misinformation and hate speech. In Poland, TikTok videos falsely claiming election fraud garnered millions of views. Despite these issues, content-related risks do not appear to have been the central public concern for election integrity.

Three issues seem underappreciated in platform risk assessments: political advertising irregularities, undisclosed influencer promotions, and algorithmic bias leading to asymmetric amplification of political candidates. In Romania, political content was mislabeled as entertainment on TikTok, while some influencers received undisclosed payments for promoting candidates. The situation may have worsened since Meta and Google stopped serving political ads in October 2025.

Civil society organizations across all three countries reported concerns about algorithmic bias, particularly on X (formerly Twitter) during the German elections. While the Commission offers guidance on mitigating manipulation in recommender systems, platforms primarily focus on combating fake engagement rather than ensuring balanced political reach.

The effectiveness of the DSA’s framework is difficult to evaluate due to limited data transparency. While platforms claim to have effective mitigation measures, external observations frequently contradict these assertions. The DSA and Commission guidelines lack clear operational benchmarks for what constitutes “reasonable, proportionate and effective” mitigation measures in electoral contexts.

The European Commission has launched formal proceedings against several platforms, including X, TikTok, and Meta (Facebook and Instagram), related to systemic risks to electoral processes. These cases could establish important precedents for future elections.

A fundamental challenge to the DSA’s collaborative approach is emerging as some platforms become increasingly non-cooperative. X continues to deny researchers access to platform data despite court rulings and a €40 million fine. With the Trump administration threatening trade retaliation against EU platform regulation, US-based tech giants may have diminishing incentives to comply with European rules.

Commission Executive Vice-President Henna Virkkunen has promised more DSA enforcement decisions in the coming months. To maintain the credibility of the systemic risks framework, the Commission cannot allow potential US backlash to deter enforcement actions, particularly as platforms’ role in electoral integrity remains a critical concern for European democracy.



© 2026 Disinformation Commission LLC. All rights reserved.