The DSA’s Impact on European Election Integrity: A Mixed Record
Social media platforms remain a significant concern for the integrity of democratic elections across Europe. With each major vote, reports surface about disinformation campaigns and foreign interference attempts. As these platforms have evolved into central arenas for public discourse, the uncomfortable reality is that a handful of private companies effectively set the rules of engagement. This dynamic is further complicated by Europe’s dependence on American tech giants amid growing transatlantic tensions over platform governance.
The EU’s Digital Services Act (DSA) and its systemic risk framework were designed as Europe’s primary defense against platform-driven threats to election integrity, with some observers even suggesting the legislation might “save democracy.” However, a retrospective analysis of three recent elections—Romania’s presidential election (November 2024), Germany’s federal election (February 2025), and Poland’s presidential election (May 2025)—yields a sobering assessment of the DSA’s effectiveness in safeguarding online civic discourse and electoral processes.
The DSA’s systemic risk framework operates as a collaborative process, requiring platforms to conduct self-assessments complemented by input from researchers and civil society organizations. Its success depends critically on transparency regarding platform risks. Currently, there remains insufficient visibility into platform activity surrounding elections to properly evaluate whether mitigation measures are adequate. The DSA’s effectiveness also hinges on platforms’ willingness to cooperate—a willingness that appears to be diminishing among major U.S. tech companies, particularly as the Trump administration signals political support and threatens trade retaliation against European regulatory actions.
Under Articles 34 and 35, the DSA requires platforms to identify, assess, and mitigate “systemic risks” stemming from their services. These self-assessments must be published annually and are subject to independent audits. The process aims to reduce information asymmetries between platforms, regulators, and the public while integrating civil society expertise into oversight mechanisms. The legislation specifically identifies “negative effects on civic discourse and electoral processes” as one category of risk platforms must evaluate.
Across the three European elections examined, several risk patterns emerged. “Behavior-related” risks such as covert influence operations were widely reported. In Romania, intelligence services uncovered a coordinated influence campaign involving over 25,000 TikTok accounts. While platforms acknowledged these threats and claimed to have implemented effective countermeasures, problems persisted. For instance, in the German elections, observers noted a lack of proper election labels on relevant profiles, while the Polish elections saw attempts to impersonate political candidates.
Content-related risks, including electoral misinformation and disinformation, were reported across all three elections but did not appear to be the public’s primary concern. In Germany, political advertisements on Facebook contained instances of misinformation and hate speech. In Poland, TikTok videos falsely claiming election fraud garnered millions of views.
Three issues appear underappreciated in platforms’ risk assessments: political advertising, influencer activity, and asymmetric amplification of political candidates. In Romania, political content was mislabeled as entertainment on TikTok. During the Polish elections, a foreign interference campaign apparently utilized Facebook advertisements. The situation may have intensified since Meta and Google ceased serving political ads in October 2025, a change that risks pushing political messaging into less transparent, unlabeled channels. Indeed, undisclosed advertising by influencers emerged as a concern, with some Romanian influencers receiving payment to promote political candidates without disclosure.
Perhaps most troubling was the consistent reporting across all three elections of asymmetric amplification of political candidates, with algorithmic bias suspected on platforms like X during the German elections. While the European Commission offers guidance on mitigating biases in recommendation systems, platforms primarily focus on countering manipulation like fake engagement rather than addressing fundamental algorithmic imbalances in candidate visibility.
For all three elections, online platforms claimed in their risk assessments to have effective mitigation measures in place. However, observations by civil society organizations and media frequently contradicted these claims, highlighting failures to detect political advertising or identify election-related misinformation. What remains unclear is whether these failures constitute non-compliance under the DSA, as the available data provides only an incomplete picture.
The DSA and Commission guidelines do not establish clear operational benchmarks for what constitutes “reasonable, proportionate and effective” mitigation measures in an electoral context. Formal proceedings already underway against X, TikTok, Facebook, and Instagram related to systemic risks to civic discourse and electoral processes may ultimately provide important precedents for future elections.
The DSA’s collaborative regulatory approach faces a fundamental limitation when platforms refuse to cooperate. X, for example, continues to reject researchers’ requests to access platform data despite court losses and a €40 million European Commission fine. With diminishing incentives for cooperation among U.S. tech giants—particularly given the Trump administration’s threats of trade retaliation—the functioning of the systemic risk framework appears increasingly vulnerable.
European Commission Executive Vice-President Henna Virkkunen has promised more DSA enforcement decisions in the coming months. Any such findings will likely trigger censorship accusations from U.S. Republicans. Despite U.S. claims that the Commission is interfering in European elections, the Commission has yet to publish a decision related to platforms and electoral integrity. To maintain the credibility of the systemic risks framework, the Commission cannot allow predictable U.S. backlash to impede necessary enforcement actions.