Social Media Platforms Failing to Moderate Election Misinformation on Short-Form Video Features
As TikTok’s popularity has surged over the past three years, other social media giants have rushed to compete by launching their own short-form video features. Meta introduced Instagram Reels in 2020, the same year YouTube debuted YouTube Shorts. Both companies have heavily invested in attracting content creators, with Instagram announcing a $1 billion investment in Reels creators and YouTube developing a revenue-sharing model that gives creators 45% of ad revenue from Shorts.
However, this rapid deployment of new features has come with significant oversight problems. According to research from the Institute for Strategic Dialogue (ISD), these platforms have failed to properly assess risks or prioritize safety when launching their short-form video products, instead building them atop existing content moderation systems designed for different formats.
The consequences have become increasingly apparent as the U.S. approaches the midterm elections. ISD researchers found election misinformation and conspiracy theories readily available across YouTube Shorts, Instagram Reels, and TikTok, despite all three platforms having specific policies against such content.
On YouTube Shorts, analysts discovered numerous videos promoting the widely debunked “2000 Mules” conspiracy film, which falsely claims the 2020 presidential election was stolen through ballot trafficking. Several of these Shorts even included links to view the full film, with none featuring information labels to counter the misinformation, despite YouTube’s stated policies against election integrity violations.
Instagram Reels proved equally problematic. Under the hashtag #electionintegrity, which carries 19,600 posts, researchers found multiple videos making false claims about mail-in ballots and “dead voters” supposedly used to manipulate elections. Although Meta’s policy states that such content will be removed from the Explore and Reels feeds, none of the first 100 videos under this hashtag carried any fact-checking labels.
TikTok made some effort to restrict certain hashtags like #2000mules, but users easily circumvented these measures with slightly altered tags such as #2kmules. The platform also permitted hashtags such as #electionstolen (314,600 views) and #riggedelectionproof (368,800 views), with most videos under them amplifying false claims that the 2020 election was rigged.
The inconsistency in applying moderation policies is particularly troubling given the significant revenue these features generate. Meta has predicted a $1 billion revenue run rate from Instagram Reels, while YouTube reports Shorts amassing more than 30 billion daily views.
“Inconsistent and insufficient moderation policies are currently assisting bad actors seeking to undermine voter faith in U.S. elections and worsening political tension surrounding Election Day,” the ISD report concluded.
The research has significant implications for upcoming elections, suggesting that platforms have prioritized market demands over safety and effective content moderation. The findings demonstrate how these short-form video features have become vectors for election misinformation despite company policies explicitly prohibiting such content.
Industry experts recommend that platforms improve their approach by consistently applying information labels to videos with hashtags known to be associated with election misinformation, better tuning recommendation systems to promote credible content, and consistently enforcing existing policies across all video formats.
As the 2022 midterms approach and the 2024 presidential election looms, the failure to address these shortcomings could further erode public trust in electoral processes and potentially contribute to voter suppression and political polarization.
20 Comments
It’s disappointing to see social media companies prioritizing growth and engagement over the integrity of our electoral process. Responsible governance is sorely lacking here.
Agreed, this is a systemic issue that goes beyond just one or two platforms. The entire industry needs to fundamentally rethink its approach to content moderation.
Alarming to see how election misinformation is spreading on social media’s short-form video features. Proper content moderation is clearly lacking, putting the integrity of elections at risk.
You’re right, the platforms need to do much more to address this issue responsibly. Prioritizing engagement and growth over safety is a dangerous approach.
This is a concerning trend, as short-form video’s immersive and engaging nature makes it a powerful vector for the spread of election disinformation. Better safeguards are urgently needed.
Absolutely. The platforms must take a more proactive and comprehensive approach to content moderation, especially for sensitive topics like elections.
The rapid expansion of short-form video has outpaced these platforms’ ability to properly moderate content. Combatting election-related disinformation should be a top priority.
Agreed. The platforms need to invest heavily in proactive detection and removal of misleading election claims, rather than relying on users to report issues.
It’s deeply concerning to see how easily election disinformation can spread on social media’s short-form video features. This highlights the urgent need for comprehensive reform.
Completely agree. The platforms’ current approaches are clearly inadequate, and they need to make this a top priority before the next major election cycle.
The spread of election disinformation on short-form video is extremely concerning. Platforms need to urgently address this issue to protect the integrity of our democratic processes.
Absolutely. This is a systemic problem that requires a comprehensive, multi-faceted solution from both the platforms and policymakers.
Troubling to read about the spread of election-related misinformation on short-form video. These platforms must do more to combat this issue and protect the democratic process.
Absolutely. Robust content moderation and fact-checking measures are critical, especially for emerging, high-engagement formats like TikTok and Reels.
This is a deeply concerning trend that undermines public trust in our democratic institutions. Social media platforms need to take urgent action to address this issue.
Absolutely. The integrity of our elections should be the top priority, not growth and engagement metrics. Comprehensive reform is clearly needed.
This is a worrying development. Social media platforms must find ways to better identify and remove election-related misinformation, even on fast-moving, high-engagement formats.
You’re right, the stakes are too high to allow this kind of disinformation to spread unchecked. Proactive, transparent, and accountable content moderation is essential.
It’s alarming to see how easily election misinformation can proliferate on short-form video features. The platforms must take this threat much more seriously.
Agreed. Proactive detection and removal of misleading content, along with increased transparency, should be top priorities for these companies.