The far-reaching impact of social media algorithms on society has become a growing concern for journalists and researchers alike. These complex systems, which determine what content appears in users’ feeds, have been linked to political instability, mental health crises, and the degradation of personal relationships worldwide.

For investigative journalists, exposing the inner workings of these algorithms presents a unique challenge. The companies behind major platforms like Meta and TikTok closely guard their algorithmic processes, forcing reporters to develop creative methods for holding these systems accountable. While some investigations require technical expertise in digital forensics or coding, many compelling stories can be produced without specialized skills.

“Not only are algorithms on social media deeply complex, the companies that produce them will also not disclose how they work,” explains media researcher Lam Thuy Vo. This opacity often forces reporters to reverse-engineer the systems themselves or to focus on demonstrating harmful outcomes rather than explaining technical details.

At their core, social media algorithms are sets of rules designed to evaluate user data – including likes, connections, and interaction history – to predict what content will generate the most engagement. These systems have grown increasingly sophisticated over time. According to a 2021 Washington Post investigation, Facebook’s algorithm analyzed approximately 10,000 signals to determine what content would appear in a user’s feed.
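A greatly simplified sketch can make that idea concrete. The Python snippet below scores and ranks candidate posts by predicted engagement; the signals, weights, and field names are invented for illustration, and real platform rankers combine thousands of signals through learned models rather than a handful of hand-set weights.

```python
# Illustrative sketch only: rank candidate posts for a user by a weighted mix
# of predicted engagement signals. All features and weights are hypothetical.

def score_post(post: dict, user: dict) -> float:
    """Return a hypothetical engagement-prediction score for one post."""
    score = 0.0
    score += 2.0 * post.get("predicted_like_prob", 0.0)          # chance the user likes it
    score += 5.0 * post.get("predicted_share_prob", 0.0)         # shares weighted more heavily
    score += 1.5 * post.get("predicted_watch_seconds", 0.0) / 60.0
    if post.get("author_id") in user.get("close_connections", set()):
        score *= 1.3                                             # boost content from close ties
    return score

def rank_feed(posts: list[dict], user: dict) -> list[dict]:
    """Sort candidate posts by descending predicted engagement."""
    return sorted(posts, key=lambda p: score_post(p, user), reverse=True)
```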

The terminology surrounding harmful content is crucial for journalists to understand. Misinformation refers to false information spread by those who believe it to be true, often occurring during breaking news events. Disinformation, meanwhile, is deliberately created to mislead or cause harm, such as the Russian disinformation campaigns targeting the 2016 U.S. presidential election. Hate speech, which attacks or incites violence against specific groups, represents another serious concern, exemplified by content that fueled violence against Myanmar’s Rohingya population.

Journalists investigating algorithmic harms typically employ three main approaches. The first examines how audiences experience algorithmic feeds, often through what researchers call a “quantified selfie” – analyzing an individual’s social media data. For example, reporter Malick Gai collected TikTok viewing archives from Senegalese migrants who had been exposed to misleading immigration information on the platform.
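In practice, a “quantified selfie” often starts with a platform’s own data-download archive. The sketch below shows the general shape of that analysis: it tallies how often entries in a watch-history export match topic keywords. The file name, JSON structure, and keywords are assumptions for illustration; real exports differ by platform and must be inspected before writing any analysis code.

```python
# Minimal "quantified selfie" sketch: count keyword matches in a personal
# watch-history export. The top-level key "history" and the fields "url" and
# "description" are hypothetical placeholders, not a real export schema.

import json
from collections import Counter

KEYWORDS = ["visa", "migration", "europe"]   # example topic terms, not from the article

def exposure_counts(export_path: str) -> Counter:
    """Tally how many viewed items mention each keyword."""
    with open(export_path, encoding="utf-8") as f:
        data = json.load(f)
    counts = Counter()
    for item in data.get("history", []):
        text = (item.get("url", "") + " " + item.get("description", "")).lower()
        for kw in KEYWORDS:
            if kw in text:
                counts[kw] += 1
    return counts

if __name__ == "__main__":
    print(exposure_counts("watch_history.json"))
```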

NBC News reporter Kat Tenbarge used a similar approach to expose groups promoting eating disorders on X (formerly Twitter), finding communities with up to 173,000 users, some as young as 13. The documentary “Can’t Look Away” featured whistleblowers who revealed how TikTok’s algorithm creates dangerous echo chambers by focusing on time spent rather than intentional searches.

The second investigative approach targets content creators who exploit algorithms to spread harmful information. Code for Africa uncovered a network of 16 Facebook accounts systematically spreading misinformation about alleged arrests of pro-Russian protesters in Ghana through coordinated copy-pasting across multiple platforms.
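Coordinated copy-pasting is one of the few patterns in this space that can be surfaced with fairly simple text analysis. The sketch below groups posts by normalized text and flags passages shared verbatim across many distinct accounts. The post structure and threshold are assumptions, and this is not Code for Africa’s actual method, only one minimal way to approach the same pattern.

```python
# Hedged sketch: surface possible coordinated copy-pasting by finding identical
# (after normalization) texts posted from many different accounts.

import re
from collections import defaultdict

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so lightly edited copies still match."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def find_copy_paste_clusters(posts: list[dict], min_accounts: int = 5) -> dict:
    """Return texts posted by at least `min_accounts` distinct accounts.

    Each post is assumed to look like {"account": "name", "text": "post body"}.
    """
    by_text = defaultdict(set)
    for post in posts:
        by_text[normalize(post["text"])].add(post["account"])
    return {text: accounts for text, accounts in by_text.items()
            if len(accounts) >= min_accounts}
```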

The third strategy involves examining the algorithms themselves, often through reverse engineering. The Wall Street Journal created 100 TikTok accounts to test how different actions affected content recommendations. ProPublica demonstrated Facebook’s algorithmic failures by successfully placing discriminatory ads that should have been blocked under fair housing laws.
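Audits like these come down to careful bookkeeping: each scripted test account logs what it is shown, and the analysis compares what different personas received. The sketch below covers only the comparison step; the record format is hypothetical, the data collection that drives the test accounts is platform-specific and omitted, and neither newsroom’s actual tooling is being reproduced here.

```python
# Sketch of the comparison step in a persona-based audit: turn each persona's
# viewing log into a topic distribution, then line the personas up side by side.

from collections import Counter

def topic_share(log: list[dict]) -> dict:
    """Fraction of recommended items per topic in one persona's viewing log."""
    counts = Counter(item["topic"] for item in log)   # each item: {"topic": "..."}
    total = sum(counts.values()) or 1
    return {topic: n / total for topic, n in counts.items()}

def compare_personas(logs: dict[str, list[dict]]) -> dict:
    """Map each persona name to its topic distribution for comparison."""
    return {persona: topic_share(log) for persona, log in logs.items()}
```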

Some of the most impactful investigations come from insiders or whistleblowers. Insider reporter Tekendra Parmar investigated the death of an Ethiopian professor who was murdered after hateful, false content about him spread on Facebook. Parmar interviewed several participants in Facebook’s Trusted Partner program who revealed that the company “routinely ignored their warnings of hateful content,” including alerts about the disinformation that led to the professor’s death.

As these algorithmic systems continue to shape our information environment, journalists face both technical and ethical challenges in exposing their impacts. Whether using sophisticated data analysis or straightforward experimental methods, creative approaches remain essential for holding these powerful and opaque systems accountable.

16 Comments

  1. Robert F. Johnson on

    As someone who closely follows the mining and commodities sector, I’m interested in how social media algorithms might be influencing the discussion and perception of these industries. Do you think there are opportunities for investigative reporting in this space?

    • That’s a great point. Mining and commodities are such important sectors, so it would be valuable to understand how social media algorithms might be shaping the narrative around them. Digging into potential biases or distortions could yield some fascinating insights.

  2. This guide highlights the critical need for transparency and accountability when it comes to social media algorithms. As an investor in mining and energy stocks, I wonder if there are ways these algorithms could be impacting market sentiment and stock prices in those industries.

    • Ava F. Williams on

      That’s an insightful observation. The potential influence of social media algorithms on financial markets is an area that definitely deserves closer scrutiny. Uncovering any manipulative or distortive effects could have major implications.

  3. I’m glad to see investigative journalism focused on this important issue. As someone with a background in data science, I’m curious about the technical approaches journalists are using to reverse-engineer these algorithms. Do they rely on crowdsourcing, simulations, or other innovative methods?

    • Lucas D. Moore on

      That’s a great question. The technical aspects of algorithm investigation are fascinating. I’d be really interested to learn more about the specific methods and tools journalists are employing, especially any novel approaches that leverage data science expertise.

  4. Elijah I. Taylor on

    I’m really curious to learn more about how social media algorithms might be influencing discussions and perceptions around topics like mining, commodities, and energy. These are such important sectors, so any biases or distortions introduced by algorithms could have significant consequences.

    • Absolutely. The potential impacts on financial markets, public discourse, and even policy decisions are concerning. Rigorous investigative reporting in this area could yield invaluable insights and spur much-needed reforms.

  5. Linda K. Taylor on

    Fascinating look at the opaque world of social media algorithms. It’s concerning how these complex systems can have such a profound impact on society without much transparency or accountability. I’m curious to learn more about the investigative techniques journalists are using to shed light on this issue.

    • Lucas V. Martin on

      Yes, the lack of transparency is a real challenge. Reverse-engineering approaches and focusing on demonstrable harms seem like smart strategies for journalists to uncover the inner workings of these algorithms.

  6. Isabella Moore on

    As someone who closely follows the mining and energy industries, I’m eager to see more investigative work on how social media algorithms might be shaping the narrative around these sectors. Transparency and accountability are critical, especially given the importance of these industries to the global economy.

    • Amelia Hernandez on

      I agree, this is a really important issue that deserves deeper exploration. Uncovering any biases or distortions introduced by social media algorithms could have significant implications for investors, policymakers, and the general public.

  7. Elizabeth Miller on

    Fascinating look at the complex world of social media algorithms and the challenges journalists face in investigating them. I’m particularly interested in the potential impacts on discussions and perceptions around industries like mining, commodities, and energy. This is an area that deserves more scrutiny.

    • Well said. Increased transparency around these algorithms is critical, especially for sectors that are so vital to the global economy. I’m eager to see the results of more investigative reporting in this space.

  8. This is a timely and critical topic. As the world becomes increasingly reliant on social media, the need for transparency and accountability around algorithms is paramount. I hope this guide inspires more investigative reporting to shed light on these opaque systems and their real-world impacts.

    • Oliver Johnson on

      Well said. Increased transparency is essential, especially for industries like mining and energy that are so vital to the global economy. Journalists have a crucial role to play in holding these algorithms accountable and uncovering their effects.

Leave A Reply
