Social media platforms have intensified efforts to combat foreign influence and misinformation during the UK’s recent general election, though questions remain about their year-round commitment to addressing these issues.

TikTok, X (formerly Twitter), and Meta—parent company of Instagram and Facebook—have all invested in measures they claim will shield users from manipulation online. Representatives from these platforms have emphasized their commitment to ensuring users receive reliable information throughout the election period.

In a notable development, X responded to allegations of misinformation for the first time since Elon Musk’s acquisition of the platform. Following investigations, all major platforms removed problematic posts and accounts flagged by researchers and journalists.

However, many of the deceptive tactics employed during this election cycle weren’t new. They had been refined and deployed by political activists well before Prime Minister Rishi Sunak called for the general election. For instance, the network of accounts that spread manipulated videos and false statements about Labour MP Wes Streeting had previously targeted Labour leader Keir Starmer during a by-election in February 2024.

“As someone who investigates social media’s real-world impact all year round, it feels like some of the companies often only really wake up and take action during an election period,” noted one social media researcher who has monitored these platforms extensively.

This pattern points to a reactive rather than proactive approach by tech companies. While platforms ramp up resources and enforcement during high-profile electoral events, the same vigilance isn't consistently applied outside these periods.

Media analysts point to a fundamental shift in how we should conceptualize the relationship between social media and politics. The notion of a discrete “social media election” has become obsolete. Instead, public opinion and political discourse are continuously shaped by content circulating on social feeds and private messaging groups long before and after any official vote takes place.

This reality makes the intermittent nature of platform enforcement particularly problematic. By the time an election is called, narratives have already been established and audiences primed through months or years of exposure to certain messaging frameworks.

Despite early concerns about artificial intelligence potentially disrupting the election through sophisticated deepfakes, traditional misinformation tactics proved more prevalent. The focus on AI threats may have diverted attention from long-standing issues with social media algorithms and well-established misinformation techniques.

Industry experts suggest that while AI safeguards are necessary for future electoral integrity, they shouldn’t overshadow the need for comprehensive regulatory frameworks addressing how social platforms operate year-round.

The UK’s Online Safety Act, which passed in 2023, represents a step toward greater regulation, but many of its provisions have yet to be fully implemented. Meanwhile, platforms continue to operate largely according to their own internal policies and enforcement priorities.

Media literacy advocates emphasize that regardless of platform policies, the public needs better tools to navigate an increasingly complex information landscape. Critical thinking skills and the ability to verify information sources have become essential democratic competencies.

As the dust settles on this election, the fundamental questions about social media’s role in democratic processes remain largely unanswered. Rather than being a “deepfake election,” this vote highlighted how established misinformation tactics continue to evolve while regulatory solutions lag behind.

The challenge for policymakers, platforms, and the public going forward will be developing approaches that address these issues continuously, not just during electoral periods when the spotlight is brightest.


© 2026 Disinformation Commission LLC. All rights reserved.