Northeastern Researchers Gain Unprecedented Access to Facebook Data to Study Fake News Spread
The global rise of fake news and misinformation continues to disrupt democracies worldwide, influencing elections in the United States and India while fueling civil unrest in countries like Sudan. As this digital epidemic spreads, researchers have long debated how misinformation proliferates—but have lacked the data to draw definitive conclusions.
Now, a groundbreaking research initiative at Northeastern University aims to solve this puzzle. Four professors from the university have been granted rare access to closely guarded Facebook data that could reveal crucial insights into how fake news circulates on social media platforms.
“It’s very exciting,” says Nick Beauchamp, assistant professor of political science at Northeastern who leads the university’s research team. “Facebook is the 800-pound gorilla, and the opportunity to work with their data in a way that’s ethical and secure is an exciting one.”
The interdisciplinary research team includes political scientist David Lazer, economist Donghee Jo, and computer scientist Lu Wang from Northeastern, along with Kenneth Joseph from the State University of New York at Buffalo. They join the first cohort of researchers worldwide granted permission to analyze Facebook’s internal data for insights into social media’s impact on democratic processes.
The team will gain access to three valuable datasets from Facebook. The first includes information from public accounts on both Facebook and Instagram, allowing researchers to track how news items gain popularity across these platforms. The second contains data on political advertisements that ran in several countries and regions, including the United States, United Kingdom, Brazil, India, Ukraine, Israel, and the European Union. The third dataset provides information about specific URLs shared by at least 100 unique Facebook users.
With these resources, the researchers aim to construct a comprehensive map tracing fake news posts back to their origins—potentially resolving a fundamental question about misinformation’s spread.
“We know there’s a problem with fake news,” Beauchamp explains. “What we don’t know is whether it’s a problem of the institutions and the moment in which we’re living, or if it’s a problem that evolved from peer-sharing.”
The conventional theory suggests that social media platforms have disrupted traditional news dissemination by allowing unverified information to spread directly from person to person, bypassing the vetting processes of established media organizations. The Northeastern study will finally put this theory to a rigorous test.
The research could reveal two distinct patterns. One possibility is that fake, misleading, or ideologically extreme news primarily originates from established media companies and is pushed to users through corporate accounts or algorithmic recommendations—placing responsibility on institutional actors. Alternatively, researchers might find that misinformation spreads predominantly through peer-sharing networks, indicating that the breakdown of traditional media gatekeeping has fundamentally altered how information circulates.
A key advantage for the researchers is Facebook’s 2018 algorithm change, which shifted priority from publisher content to posts from friends and family. By analyzing fake news propagation before and after this pivotal change, the team can determine whether people or algorithms are more responsible for misinformation’s spread.
“From my point of view, the content on Facebook tends to be rich and robust, and speaks to the actual demographics of the U.S.,” notes Beauchamp. “It’s always been something that’s interested us; we just haven’t had access to it until now.”
The research comes at a critical moment for Facebook, which has faced intense scrutiny for its role in various democratic crises—from allowing Cambridge Analytica to access millions of users’ private data to failing to block hundreds of fake accounts from running ads designed to influence U.S. elections in 2016 and 2018.
The project is funded through a grant program established by the Social Science Research Council and Social Science One, specifically designed to help scholars study Facebook’s impact on global democracy. This unprecedented collaboration between academic researchers and the social media giant represents a potential turning point in understanding—and eventually combating—the fake news phenomenon that continues to threaten democratic institutions worldwide.