New research from the University of Copenhagen points to a simple yet powerful method for combating the spread of misinformation on social media: adding small barriers that make sharing content slightly more difficult.

While platforms like Facebook, Instagram, and X have designed sharing to be a seamless, one-click action, this frictionless design may be contributing to the rapid spread of false information. Studies have consistently shown that misinformation spreads faster on social media than factual content does, partly because platform algorithms tend to amplify sensational posts that generate high engagement.

The new study, published in the journal npj Complexity, suggests a straightforward solution to this growing problem. “Our idea is to introduce a small pause in the sharing process to make people reflect on what they’re sharing before clicking the button,” explains lead author Laura Jahn, a PhD student involved in the research alongside Professor Vincent F. Hendricks.

The researchers developed a computer model simulating information flow across platforms like X, Bluesky, and Mastodon. Their findings indicate that introducing small digital frictions—such as pop-up messages—can effectively reduce impulsive content sharing behaviors that often fuel misinformation campaigns.
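
The paper's exact dynamics aren't reproduced here, but a toy agent-based sketch can illustrate the mechanism the researchers describe. Everything in this illustration (the simulated post stream, the engagement-driven sharing impulse, and the friction factor standing in for a pop-up pause) is an assumption made for demonstration, not a value from the study:

```python
import random

random.seed(1)

# Toy post stream: each post has an "engagement pull" (how clickable it is)
# and an independent "quality" score in [0, 1]. All numbers are illustrative
# assumptions, not parameters from the npj Complexity paper.
posts = [
    {"engagement": random.random(), "quality": random.random()}
    for _ in range(10_000)
]

def shares(posts, friction=1.0):
    """Return the posts shared when each impulse is damped by a friction factor.

    friction = 1.0 models frictionless one-click sharing; values below 1.0
    stand in for a pop-up pause that talks some users out of sharing.
    """
    return [p for p in posts if random.random() < p["engagement"] * friction]

baseline = shares(posts, friction=1.0)
paused = shares(posts, friction=0.6)  # assume the pause deters ~40% of impulses

print(f"shares without friction: {len(baseline)}")
print(f"shares with friction:    {len(paused)}")
```

With these made-up numbers, the friction factor cuts the share count roughly in proportion to its strength, which is all a friction-only intervention can claim: fewer shares, not necessarily better ones.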

However, the team discovered that reducing sharing volume alone doesn't necessarily improve content quality. To address this limitation, they extended their model with an educational component that users encounter before sharing posts.

“It could be a pop-up with a short quiz asking questions like: How is misinformation defined, and what does this social media platform do to limit fake news?” Hendricks explains. “The idea is that this learning element will prompt users to reflect on their behavior and share fewer problematic posts.”

The results are promising. When friction was combined with educational elements, the average quality of shared content increased significantly according to the model’s predictions.
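
Again as a hedged illustration rather than the authors' actual model, the same toy setup can show why the learning element matters: if a share of the sharing impulse is tied to a post's quality instead of its raw engagement pull, the average quality of what gets shared rises even while overall volume stays suppressed. The `learning` weight below is a hypothetical stand-in for the quiz pop-up's effect:

```python
import random

random.seed(2)

# Same illustrative post stream as before: engagement and quality are
# independent draws in [0, 1]; none of these numbers come from the study.
posts = [
    {"engagement": random.random(), "quality": random.random()}
    for _ in range(10_000)
]

def avg_shared_quality(posts, friction=1.0, learning=0.0):
    """Mean quality of shared posts under friction plus a learning element.

    `learning` (0..1) shifts the sharing impulse away from raw engagement
    and toward the post's quality; it is a hypothetical proxy for the
    educational pop-up, not a fitted model parameter.
    """
    shared = []
    for p in posts:
        impulse = (1 - learning) * p["engagement"] + learning * p["quality"]
        if random.random() < impulse * friction:
            shared.append(p)
    if not shared:
        return 0.0
    return sum(p["quality"] for p in shared) / len(shared)

print(f"friction only:       {avg_shared_quality(posts, friction=0.6):.3f}")
print(f"friction + learning: {avg_shared_quality(posts, friction=0.6, learning=0.5):.3f}")
```

Because quality and engagement are independent in this toy, the friction-only condition hovers around 0.5, while adding the learning weight pushes the average noticeably higher: the same qualitative pattern the study reports.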

This research comes at a critical time as social media platforms face mounting pressure from regulators, advertisers, and users to address the spread of false information. Major platforms have experimented with various interventions in recent years, including warning labels and reducing the visibility of disputed content, but misinformation continues to flourish online.

The approach proposed by the Danish researchers represents a behavioral design solution that doesn’t require complex content moderation systems or artificial intelligence to implement. Instead, it leverages basic principles of human psychology to encourage more thoughtful engagement.

The researchers now aim to test their model’s predictions in real-world settings. “We hope our proposal will inspire tech giants to think innovatively in the fight against misinformation. They could help us test the promising computer model to see whether engagement with low-quality content decreases and whether users become better at recognizing misinformation in real situations,” say Jahn and Hendricks.

If partnerships with major platforms prove difficult, the team plans to utilize research-focused simulated platforms to gather empirical data on their intervention’s effectiveness.

The study was conducted at the University of Copenhagen’s Center for Information and Bubble Studies, which specializes in researching information ecosystems and cognitive biases in digital environments.

As social media continues to play an increasingly central role in how people consume news and information, finding practical, implementable solutions to misinformation remains a critical challenge for platforms, researchers, and society at large.


6 Comments

  1. Liam P. Rodriguez

    Anything platforms can do to slow the spread of misinformation is worth considering. A small pause before sharing seems like a relatively low-impact change that could have meaningful benefits.

  2. Isabella Garcia

    I’m skeptical that adding friction to sharing will significantly reduce misinformation. Dedicated bad actors will find ways around any barriers, while well-intentioned users may simply get frustrated. A more holistic approach is needed.

    • Isabella Rodriguez

      That’s a fair point. Friction alone won’t solve the complex problem of misinformation. Platforms will need to use a variety of tactics, including improving content moderation and promoting media literacy.

  3. Amelia Williams

    This research highlights an interesting tension between user experience and content integrity on social platforms. I’m curious to see how different networks approach this challenge in their own ways.

  4. Robert Rodriguez

    Interesting approach to combating misinformation. A small pause before sharing could encourage more reflection on content before spreading it. Curious to see how effective these friction points are in practice.

  5. Lucas Thompson

    Restricting online sharing seems like a reasonable way to slow the rapid spread of misinformation. Platforms should find the right balance between convenience and responsibility when it comes to content distribution.

