In what’s being called a “super-election year,” voters across more than 50 countries will head to the polls amid rising concerns about online information manipulation. The increasing availability of artificial intelligence tools has created new avenues for malicious actors seeking to undermine democratic processes and erode public trust in elections.

Despite these mounting threats, major social media platforms like Facebook, Instagram, YouTube, and X (formerly Twitter) have been scaling back safeguards designed to combat harmful content and disinformation. Even more concerning is the rise of smaller alternative platforms such as Telegram, Mastodon, and Bluesky, which typically have fewer resources dedicated to preserving information integrity during critical election periods.

The decentralization of online communication is a double-edged sword. While it gives users diverse platform options, it also makes election misinformation harder to track and prevent. This fragmentation creates blind spots where disinformation can flourish unchecked.

Recent elections worldwide have illustrated these vulnerabilities. During Kenya’s 2022 elections, the Mozilla Foundation documented how TikTok became a conduit for rapidly spreading political disinformation that exploited civilians’ fears of post-electoral violence to intimidate voters and heighten political tensions.

Regional platforms based outside U.S. jurisdiction, such as Kakao, Line, and Viber, which have substantial user bases in specific countries, often apply different content moderation standards. Their policies may be developed ad hoc, reflect more permissive community norms, or lack sufficient safeguards against hate speech and disinformation.

The upheaval at X following Elon Musk’s acquisition has created opportunities for alternative platforms. Mastodon and Bluesky have gained traction, while Discord, Clubhouse, Twitch, and Gab have also seen increased usage. However, none have yet achieved the scale of established social media giants, suggesting that the future online landscape will likely remain distributed across various platforms and communities.

In Iran’s 2021 presidential elections, Clubhouse emerged as a forum for electoral debates, with candidates using the app’s “rooms” to discuss their political objectives with thousands of participants. Yet the platform’s privacy policies, which connect accounts to phone numbers and national IDs, raised serious concerns about user safety in a restrictive political environment.

Brazil’s 2022 presidential election saw Telegram become the dominant messaging platform. Researchers at the Brazilian Internet Lab identified disinformation campaigns on the platform calling for military intervention and questioning verified election results. Telegram addressed these issues only after Brazil’s Supreme Electoral Court ordered the platform blocked, a confrontation that ultimately produced a memorandum of understanding between the company and the government.

While the Brazilian case represents a partial success, it resulted from specific factors unique to that country and is unlikely to provide a repeatable model for global content moderation. The challenge remains: how can stakeholders engage with smaller platforms that often resist participation from civil society, public sector entities, and election management bodies?

Experts suggest several approaches to ensure social media platforms positively impact democratic processes. Civil society organizations, academics, and investors must work together to provide policy and technical oversight, and to support platforms even in countries where legal regulation is limited.

One promising strategy involves open-source governance models similar to those used by Wikipedia and Reddit, where communities of moderators help shape platform policies. Academic institutions like Indiana University’s Observatory on Social Media have created dashboards tracking top disinformation spreaders, while open-source human rights impact assessments can measure platforms’ effects on elections.
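
To make the dashboard idea concrete, here is a minimal Python sketch of how a tracker might rank accounts by how often they share links to low-credibility domains. The sample data, the domain list, and the ranking logic are illustrative assumptions for this article, not the actual methodology of Indiana University's Observatory on Social Media.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical sample data: (account, shared_url) pairs collected from a
# platform's public API. A real dashboard would ingest a continuous stream.
SHARES = [
    ("@acct_a", "https://lowcred.example/story1"),
    ("@acct_a", "https://lowcred.example/story2"),
    ("@acct_b", "https://news.example/report"),
    ("@acct_c", "https://lowcred.example/story1"),
    ("@acct_a", "https://hoax.example/claim"),
]

# Assumed list of low-credibility domains; in practice this would come from
# an externally maintained source such as fact-checker or media-quality ratings.
LOW_CREDIBILITY_DOMAINS = {"lowcred.example", "hoax.example"}

def top_spreaders(shares, flagged_domains, n=10):
    """Count how often each account shares links to flagged domains."""
    counts = Counter(
        account
        for account, url in shares
        if urlparse(url).netloc in flagged_domains
    )
    return counts.most_common(n)

if __name__ == "__main__":
    for account, count in top_spreaders(SHARES, LOW_CREDIBILITY_DOMAINS):
        print(f"{account}: {count} low-credibility shares")
```

Counting raw shares keeps the example simple; a production tracker would more likely weight each share by its reach or audience size.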

Multi-stakeholder coalitions can conduct systemic risk assessments and develop crisis protocols for addressing electoral issues. Examples include the Meta Oversight Board, the Christchurch Call to Action, and Article 19’s Social Media Council concept, which provide formal or informal oversight mechanisms.

Human rights-minded investors could play a crucial role by requiring startups to meet certain information integrity criteria, such as transparency around content moderation and privacy protection. After a recent incident involving racial slurs in usernames, Bluesky announced increased investment in its trust and safety team, reportedly following pressure from investors.

Technology can be part of the solution as well. “Democracy bots” could be developed to flag inauthentic political accounts and deepfake content or provide users with information about platform moderation activities. Platforms could incentivize these developments by rewarding developers who create tools aligned with democratic principles.
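
As a rough illustration of what a "democracy bot" might check, the sketch below scores accounts against a few simple authenticity heuristics. The feature set, thresholds, and weights are invented for this example; real detection systems combine far richer behavioral, timing, and network signals, usually with machine-learned models.

```python
from dataclasses import dataclass

@dataclass
class AccountActivity:
    """Minimal public-profile features; the field choices are illustrative."""
    handle: str
    account_age_days: int
    posts_per_day: float
    duplicate_post_ratio: float  # fraction of posts repeating earlier text

def inauthenticity_score(acct: AccountActivity) -> float:
    """Toy heuristic: newer, hyperactive, repetitive accounts score higher."""
    score = 0.0
    if acct.account_age_days < 30:
        score += 0.4  # very new account
    if acct.posts_per_day > 100:
        score += 0.3  # implausibly high posting rate for a human
    score += 0.3 * acct.duplicate_post_ratio  # copy-paste amplification
    return min(score, 1.0)

if __name__ == "__main__":
    suspect = AccountActivity("@fresh_patriot_2024", 12, 240.0, 0.8)
    print(f"{suspect.handle}: score {inauthenticity_score(suspect):.2f}")
```

Transparent threshold-based scores like this are easy to audit and explain to users, which matters for tools meant to bolster trust in platforms rather than erode it.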

App stores also wield significant influence. Apple and Google have previously required platforms like Telegram to implement basic content policies. However, using app stores as the sole enforcement mechanism has limitations, potentially leading to disproportionate responses or superficial compliance.

As authoritarian countries build their own digital ecosystems that disregard democratic principles, the need for collaborative efforts among platforms, civil society, investors, and users becomes increasingly urgent. By addressing the unique challenges posed by diverse online platforms and advocating for democratic values, stakeholders can work toward a more inclusive and democratic digital environment during this critical global election year.

