Moldova’s recent parliamentary election on September 28 reaffirmed the country’s pro-European path, with the ruling Party of Action and Solidarity securing the largest share of seats amid record participation from Moldovan voters living abroad. Yet beneath this democratic exercise, a more troubling narrative unfolded across social media platforms, exposing a critical vulnerability in Europe’s information ecosystem.
The election highlighted a growing disconnect between national regulatory frameworks and the borderless reality of online platforms, creating a vacuum where disinformation thrives.
In July 2025, Moldova’s parliament revised its Audiovisual Media Services Code, aiming to align with European standards, fight disinformation, and promote responsible media. The law expanded the Audiovisual Council’s authority to address “false information” and “manipulation,” yet it failed to provide clear legal definitions for either term, leaving enforcement inconsistent.
More problematically, Moldova’s regulatory framework only applies to platforms with a legal presence inside the country. Major social media companies like Meta, Google, and TikTok operate without a direct Moldovan presence, meaning their content moderation follows corporate guidelines rather than national laws. This creates a striking imbalance: local broadcasters face fines for biased reporting, while viral social media posts reaching hundreds of thousands of Moldovans remain largely unregulated.
This regulatory gap has created fertile ground for foreign influence operations. Investigators have tracked how Ilan Shor, an exiled businessman under U.S. sanctions, continued funding social media advertising for his banned political party from abroad. In 2024, researchers identified more than 100 Facebook pages connected to his network, collectively generating hundreds of millions of views and approximately $200,000 in revenue for the platform.
These campaigns portrayed protests as spontaneous uprisings, attacked European integration efforts, and undermined trust in Moldovan institutions. When Meta removed some of these pages, mirror campaigns quickly emerged under new names.
By 2025, these tactics had evolved into more sophisticated operations. An outlet branded REST Media flooded TikTok, Telegram, and YouTube with anti-EU narratives. Cybersecurity researchers later linked the operation to Rybar, a Russian influence network known for repackaging Kremlin messaging through AI-generated voices and translated scripts.
Promo-LEX, Moldova’s leading election observer organization, identified approximately 500 coordinated accounts promoting nearly identical content within a single three-day window of the campaign. Some videos accumulated more than 1 million views, often artificially boosted by inauthentic engagement. Each interaction fed what observers call “the commercialization of deception” – an invisible economy in which dark money buys reach and platforms profit from the traffic.
The real threat to democratic discourse isn’t simply censorship but inauthentic speech – content generated or amplified by fake, automated, or paid accounts that simulate public consensus and distort genuine debate. When sanctioned figures can still purchase digital influence through intermediaries, authentic voices are crowded out.
Addressing these challenges requires more substantial action from platforms. They need regional moderation teams with local-language expertise empowered to respond rapidly to emerging threats. Political advertising must meet rigorous transparency standards regarding funding sources and intermediaries. When financial trails disappear into shell companies or opaque agencies, such advertisements should be rejected.
Moldova’s experience also highlights limitations in the European Union’s regulatory reach. While the Digital Services Act demands accountability from tech companies within EU borders, countries on Europe’s periphery remain particularly vulnerable to digital interference. These emerging democracies are effectively serving as testing grounds for information manipulation tactics that could later target larger Western nations.
Despite weeks of coordinated disinformation aimed at eroding public trust and suppressing turnout, Moldova’s election ultimately proceeded successfully. Voters navigated manipulation and media bias to make their democratic choice. However, democratic resilience should not be the minimum standard for success.
As long as algorithms amplify deceptive content faster than institutions can counter it, smaller European democracies will continue facing significant challenges. The question remains whether tech platforms will begin treating these regions as strategic priorities rather than low-importance markets, recognizing that their business models now significantly influence political outcomes across Europe’s eastern edge.
Protecting freedom of speech increasingly means safeguarding authenticity – ensuring that online voices represent real citizens rather than automated or coordinated inauthentic networks. If platforms continue monetizing manipulation, they risk undermining the very democratic systems that enable their operation, potentially inviting more severe regulatory responses in the future.
12 Comments
The lack of clear definitions around ‘false information’ and ‘manipulation’ is concerning. This could lead to overzealous censorship or inconsistent application of the rules.
Good point. Striking the right balance between addressing disinformation and protecting free speech is crucial. Careful legal drafting is required to provide clarity and safeguards.
This highlights the need for greater transparency and accountability from tech companies operating across borders. They need to do more to secure the integrity of democratic processes.
Absolutely. Self-regulation has proven inadequate – clear legal frameworks and enforcement mechanisms are needed to hold platforms responsible for harmful content.
Interesting to see the challenges Moldova faces with foreign disinformation campaigns targeting its elections. Effective media regulation seems crucial, but defining and enforcing ‘false information’ is clearly a complex issue.
You’re right, the borderless nature of online platforms makes domestic regulation a real challenge. Tackling disinformation requires a coordinated international approach.
Moldova’s efforts to align with European standards on media regulation are admirable, but the enforcement challenges highlight the need for greater international cooperation and harmonization of rules.
Agreed. The EU could play a leading role in developing a comprehensive framework to address cross-border disinformation, leveraging its regulatory power and influence.
This is a complex issue without easy solutions. Domestic laws can only do so much when the platforms and information flows are global. A more collaborative, multilateral approach seems essential.
The mismatch between national regulations and the borderless nature of online platforms is a key vulnerability. Innovative, cross-border solutions are needed to address this challenge.
The influence of foreign actors like Russia in Moldova’s elections is concerning. Strengthening democratic resilience and media literacy should be priorities to counter these threats.
Absolutely. Building public awareness and critical thinking skills is crucial to empower citizens to navigate the information landscape and identify manipulation attempts.