Research Reveals Significant Gap in Understanding Data-Based Misinformation
Data-driven communication dominates modern news and public discourse, yet the role of numbers in spreading misinformation remains poorly understood. A new academic review finds that misleading statistics and visuals are widely used to influence opinion but are rarely treated as misinformation in their own right.
The findings appear in “Data in the Context of Misinformation: A Scoping Review,” published in Journalism & Mass Communication Quarterly. The paper compiles decades of research on how people interpret, trust, and misread numerical and visual data, revealing a critical blind spot in misinformation research.
Most scholarship still treats false or misleading information primarily as a text-based problem. Fabricated claims, deceptive headlines, and false narratives dominate academic attention, while numbers and visuals are often relegated to secondary status or examined only as technical details. In many studies, misleading statistics or charts appear merely as examples rather than as the central focus of analysis.
This narrow approach is particularly troubling given how frequently data appear in real-world misinformation. Public debates on health, climate change, economics, and elections routinely feature graphs, percentages, and numerical comparisons that shape perceptions of risk, urgency, and credibility long before audiences evaluate accompanying text.
“When misleading data aren’t recognized as misinformation, they may escape scrutiny by journalists, fact-checkers, and platforms that prioritize textual accuracy over numerical or visual integrity,” notes the review. This gap becomes especially dangerous during crises such as pandemics or natural disasters, when rapidly changing data directly influence public behavior and policy decisions.
Instead of being studied as misinformation, misleading data are typically examined through separate disciplinary lenses. Design researchers focus on poor visualization practices like truncated axes or unclear labels. Psychologists explore how people misinterpret numbers or struggle with probability. Statisticians examine errors in data reporting or analysis. While each approach offers valuable insights, their separation prevents a comprehensive understanding of how data function within broader misinformation ecosystems.
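To make one of those design tactics concrete, here is a minimal, self-contained sketch (the numbers are invented for illustration, not drawn from the review) showing how a truncated y-axis can make a trivial difference look dramatic:

```python
# Minimal sketch: how a truncated y-axis exaggerates a small difference.
# The values are hypothetical (e.g., two survey results two points apart).
import matplotlib.pyplot as plt

labels = ["Group A", "Group B"]
values = [52.0, 54.0]  # only a 2-point difference

fig, (honest, truncated) = plt.subplots(1, 2, figsize=(8, 3))

# Honest chart: the axis starts at zero, so the bars look nearly identical.
honest.bar(labels, values)
honest.set_ylim(0, 60)
honest.set_title("Axis starts at 0")

# Misleading chart: the axis starts at 51, so Group B's bar appears
# roughly three times taller even though the data are unchanged.
truncated.bar(labels, values)
truncated.set_ylim(51, 55)
truncated.set_title("Truncated axis")

fig.suptitle("Same data, different impressions")
plt.tight_layout()
plt.show()
```

The underlying values are identical in both panels; only the axis origin changes, which is why visualization guidelines generally recommend starting bar-chart axes at zero.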
As a result, fundamental questions remain largely unanswered: How often does misinformation rely on data compared to text-only claims? Do data-based falsehoods spread differently or persist longer? Are audiences more likely to trust misinformation when it includes numbers or charts?
The review synthesizes findings using a dual-process model of cognition to explain why data-based misinformation can be so powerful. Data often exert their strongest influence during the earliest, intuitive stage of information processing, when people rely on fast mental shortcuts rather than slow, deliberate analysis.
Visual features such as bar heights, line slopes, or color contrasts draw immediate attention. Numerical anchors, like a striking percentage or large figure, set reference points that shape subsequent judgments. Research shows that misleading visual cues and numerical framing can distort perception almost instantly, creating impressions that persist even when people engage in more reflective thinking.
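As a hedged illustration of how numerical framing sets an anchor (the example is hypothetical, not taken from the review), the same small change in risk can be reported as a dramatic relative increase or a negligible absolute one:

```python
# Hypothetical example: a risk rising from 2 in 10,000 to 3 in 10,000.
baseline = 2 / 10_000
new = 3 / 10_000

relative_increase = (new - baseline) / baseline  # 0.5
absolute_increase = new - baseline               # 0.0001

print(f"Relative framing: risk up {relative_increase:.0%}")  # "risk up 50%"
print(f"Absolute framing: risk up "
      f"{absolute_increase * 100:.2f} percentage points")    # "0.01 percentage points"
```

A headline anchored on "50%" sets a very different reference point than one anchored on "0.01 percentage points," even though both describe the same data.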
“Corrections and clarifications often struggle to fully undo the influence of misleading numbers or visuals,” the review states. “Early anchors continue to shape judgments, helping explain why retracted statistics or corrected charts leave lasting impressions.”
Interpretation errors increase when data representations don’t align with familiar mental schemas. Unusual chart formats, logarithmic scales, or complex visualizations raise cognitive load and the risk of misunderstanding. While longer deliberation can improve accuracy in some cases, it can also introduce new errors when people lack numeracy or graph literacy.
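To illustrate the scale issue the review raises, here is a short sketch (with hypothetical growth numbers) plotting the same series on linear and logarithmic axes:

```python
# Minimal sketch: identical exponential growth on linear vs. logarithmic axes.
# The series is invented (15% daily growth) purely to show how scale choice
# changes the visual impression.
import matplotlib.pyplot as plt

days = list(range(30))
cases = [100 * 1.15 ** d for d in days]

fig, (linear, log) = plt.subplots(1, 2, figsize=(8, 3))

linear.plot(days, cases)
linear.set_title("Linear scale: explosive curve")

log.plot(days, cases)
log.set_yscale("log")
log.set_title("Log scale: straight line")

for ax in (linear, log):
    ax.set_xlabel("Day")
    ax.set_ylabel("Cases")

plt.tight_layout()
plt.show()
```

Neither scale is inherently misleading, but readers unfamiliar with logarithmic axes may read the straight line as slow, steady growth rather than exponential change.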
Interestingly, the review shows that data are not automatically trusted. Public skepticism toward statistics has grown in recent years, especially regarding politicized numbers. However, mistrust doesn’t eliminate data’s persuasive power. Credibility judgments depend heavily on how well data align with existing attitudes and expectations. When numbers support prior beliefs, they’re more likely to be accepted and remembered, even if misleading.
This interaction between data and motivated reasoning creates a paradox: highly numerate individuals may better detect errors but can also more skillfully interpret numbers to defend preexisting views. The review finds mixed evidence on whether higher numeracy consistently reduces susceptibility to misinformation, underscoring the complexity of data-driven persuasion.
The findings highlight significant implications for journalism and public communication. Journalists translate complex data into accessible stories, yet their role in creating or amplifying misleading data is rarely examined directly. Design choices, simplifications, and framing decisions made under deadline pressure can unintentionally distort meaning, even when underlying data are sound.
The research suggests that preventing data-based misinformation requires more than after-the-fact fact-checking. It demands stronger data literacy among journalists, clearer standards for visual and numerical reporting, and greater transparency around data sources and methods.
As data continue to dominate public discourse, addressing these gaps becomes increasingly urgent. Without better understanding of how misleading numbers and visuals operate, efforts to combat misinformation will remain incomplete and potentially ineffective.