Major Tech Platforms Fall Short on EU Disinformation Code Compliance
Social media giants are facing criticism for inadequate progress in combating disinformation across the European Union, despite signing onto a voluntary code of practice aimed at tackling fake news and misinformation.
Industry watchdogs and fact-checking organizations point to significant gaps in implementation, with most major platforms showing minimal results more than six months after the code took effect. When reached for comment, Twitter failed to respond before publication, while Google issued a statement expressing its commitment to “making the Code of Practice a success.” Meta defended its efforts, stating it has built “the largest global fact-checking network of any platform” and invested heavily in combating falsehoods.
However, critics remain unconvinced. Analysts like Nicotra highlighted Google’s lack of progress in collaborating with fact-checkers and criticized the company for falling “really behind on transparency and access to data” for independent researchers. TikTok received somewhat better marks but still faces concerns, with Nicotra noting that the platform’s algorithm “is still massively accelerating disinformation” despite some improvements.
TikTok has taken more visible action than its competitors. The video-sharing platform reported removing nearly 865,000 fake accounts across the European Union between June and December 2022, accounts that had accumulated over 18 million followers before being taken down. The removals were concentrated in several large markets: more than 175,000 fake accounts in France, 155,000 in Spain, and over 138,000 in Germany. The platform also banned over 4,000 accounts for impersonation, including 842 in Germany and roughly 500 each in France and Poland.
Despite these efforts, Carlos Hernández, head of public policy at the Spanish-language fact-checking organization Maldita.es, expressed disappointment with the overall industry response. “The Code has been in place for more than half a year and most larger platforms have very little to show up for [it], maybe with the exception of Meta and TikTok to a lesser degree,” Hernández stated.
Even for platforms like TikTok that have provided more detailed information than their competitors, experts question the trustworthiness of self-reported figures. The absence of external audits and continuing barriers to data access for independent researchers make it difficult to verify companies’ claims about their disinformation efforts.
TikTok appears to recognize this criticism. Caroline Greer, the company’s head of EU public affairs, acknowledged the need for greater transparency and confirmed that the platform is working on expanding access for researchers. “It’s an example of one project where it’s not in place today, but it’s in progress,” Greer said.
The struggle to implement effective anti-disinformation measures comes at a critical time for social media companies operating in Europe. The EU has been progressively tightening regulatory requirements for digital platforms, with the Digital Services Act set to impose stricter obligations on content moderation and algorithmic transparency.
Industry observers note that the voluntary code was meant to demonstrate tech companies’ willingness to self-regulate ahead of more stringent legislation. The apparent failure to deliver substantial results may strengthen arguments for mandatory regulations with enforcement mechanisms rather than voluntary commitments.
As election cycles approach in several European countries, the pressure on platforms to demonstrate effective action against disinformation is likely to intensify. Public concern about the role of social media in spreading false information remains high, particularly after documented instances of foreign interference in previous electoral processes across the continent.