The Growing Pandemic of Disinformation in the Digital Age

Disinformation has expanded to pandemic proportions over recent decades, creating damaging social impacts on a global scale. Once primarily the tool of governments hostile to democracy, disinformation is now increasingly employed in domestic politics and for promoting toxic ideological agendas across societies worldwide.

The function of disinformation is twofold: to fracture existing beliefs and to implant false ones, seducing victims into serving the agenda of those who produced it. Its explosive growth over the past three decades directly parallels the expansion of digital networks and the rise of ubiquitous social media platforms.

While mass distribution of disinformation is not new, its scale and reach have transformed dramatically in the digital era. Gutenberg’s printing press, introduced nearly six centuries ago, enabled large-scale exploitation of printed material for propaganda. Less commonly known is that television was broadcasting Nazi propaganda as early as 90 years ago, shortly after the technology’s inception.

“History demonstrates that newer and faster media for distributing information will inevitably be exploited for both beneficial and harmful purposes,” experts note.

Digital Amplification Creates Unprecedented Reach

For purveyors of disinformation—whether rogue states, political groups, ideological movements, or activists promoting questionable causes—the digital age has created unprecedented opportunities to spread false narratives.

The widespread access to fast networks via smartphones and computers allows direct distribution to audiences numbering in the billions. Meanwhile, advancing technologies like artificial intelligence enable “microtargeting” with individually customized narratives and the production of increasingly convincing “deepfakes” designed to manipulate audiences.

Scientific study of disinformation has lagged behind its proliferation. While deception has long been studied in the humanities, robust mathematical models for analyzing it emerged only about 25 years ago and are still neither widely known nor widely used in mainstream discourse.

Understanding the Science of Deception

Disinformation is deception framed within a social context, most often associated with governments and political discourse. While the term “misinformation” is often used interchangeably, the current literature typically treats disinformation as intentional deception and misinformation as incidental or accidental. Both are commonly labeled “fake news.”

The Borden-Kopp model, developed independently by mathematician Andrew Borden and computer scientist Carlo Kopp approximately 25 years ago, is the simplest mathematical model that accurately captures deception. Based on Shannon’s information theory, it describes four distinct methods through which deceptions can be carried out:

  1. Degradation: Hiding facts by burying them in background noise—exemplified in nature by camouflage and concealment. In media, this occurs through flooding audiences with irrelevant messages to distract from important information.

  2. Corruption: Using mimicry, where a fake is made to appear sufficiently similar to something real that victims are tricked into false beliefs. Modern “deepfakes” represent clear examples of this approach.

  3. Denial: Rendering information channels temporarily or permanently unusable—comparable to squids squirting ink to blind predators in nature. Contemporary examples include “deplatforming” influencers or blocking websites.

  4. Subversion: The most sophisticated form of deception, in which the victim’s information processing is manipulated to the attacker’s advantage. This commonly involves altering how victims interpret situations or facts, much as “spin doctors” operate in political discourse.

In practice, compound deceptions that combine several of these methods are commonly deployed to maximize effectiveness.
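
To make the four categories concrete, the Python sketch below caricatures them as operations on a toy message channel read by a victim with limited attention. It is an illustrative sketch only, not the Borden-Kopp formulation itself, which is expressed in Shannon’s information-theoretic terms; every message, scoring rule, and parameter here is invented for the example.

```python
import random

random.seed(0)  # reproducible demo

# The genuine report the attacker wants to displace.
GENUINE = {"text": "official result: turnout was 61%", "genuine": True}

def victim_score(msg):
    """Toy victim heuristic: favour official-sounding items, with a bonus
    for sensational framing (a cognitive shortcut an attacker can exploit)."""
    score = 0
    if "official" in msg["text"]:
        score += 2
    if "SHOCKING" in msg["text"]:
        score += 3
    return score

def receive(channel, score=victim_score, attention=3):
    """A victim with limited attention inspects a few items and keeps the
    one its heuristic rates highest (None if the channel is empty)."""
    return max(channel[:attention], key=score, default=None)

# --- the four deception methods, as operations on channel or victim ---

def degrade(channel):
    """Degradation: flood the channel with irrelevant noise so the genuine
    item tends to fall outside the victim's attention window."""
    noise = [{"text": f"irrelevant item {i}", "genuine": False} for i in range(50)]
    flooded = channel + noise
    random.shuffle(flooded)
    return flooded

def corrupt(channel):
    """Corruption: add a mimic crafted to out-score the genuine item."""
    mimic = {"text": "SHOCKING official result: turnout was 16%", "genuine": False}
    return channel + [mimic]

def deny(channel):
    """Denial: render the channel unusable (blocking, deplatforming)."""
    return []

def subvert(score):
    """Subversion: leave the channel alone and manipulate how the victim
    interprets it -- here, by inverting its own ranking rule."""
    return lambda msg: -score(msg)

if __name__ == "__main__":
    clean = [GENUINE, {"text": "weather update", "genuine": False}]
    print("no attack:  ", receive(clean))
    print("degradation:", receive(degrade(clean)))
    print("corruption: ", receive(corrupt(clean)))
    print("denial:     ", receive(deny(clean)))
    print("subversion: ", receive(clean, score=subvert(victim_score)))
```

Each strategy in the sketch attacks a different point: degradation and denial limit what the victim can see, corruption changes what the channel contains, and subversion changes how the victim interprets what is already there.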

Cognitive Vulnerabilities

Individual vulnerability to deception varies significantly based on how people process what they perceive. Naturally skeptical individuals who critically assess observations can often unmask deceptions. Conversely, those who are naive or unwilling to critically evaluate information remain highly susceptible.

Human cognition evolved primarily as a survival mechanism, often favoring solutions that minimize effort and reaction time over accuracy. This tendency toward cognitive shortcuts creates exploitable vulnerabilities that skilled deceivers readily target.

Common cognitive vulnerabilities include confirmation bias, motivated reasoning, the Dunning-Kruger effect, and numerous other identified cognitive biases that disinformation campaigns routinely exploit.
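
As a rough illustration of how such a bias becomes exploitable, the toy acceptance rule below weights belief-congruence more heavily than evidential strength; the formula and its numbers are hypothetical, chosen only to make the effect visible.

```python
# Hypothetical toy model of confirmation bias as a message-acceptance rule:
# acceptance depends more on how well a claim matches prior belief than on
# its actual evidential strength.  All values are invented for illustration.

def acceptance_probability(evidence, congruence, bias_weight=0.7):
    """Blend evidential strength (0..1) with belief-congruence (0..1).
    A high bias_weight means the receiver mostly 'checks the vibe'."""
    return bias_weight * congruence + (1 - bias_weight) * evidence

# A weakly supported but belief-congruent falsehood ...
print(acceptance_probability(evidence=0.2, congruence=0.9))  # ~0.69
# ... outranks a well-supported but belief-incongruent truth.
print(acceptance_probability(evidence=0.9, congruence=0.1))  # ~0.34
```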

The Viral Spread of Falsehoods

Research into why deceptive messages spread efficiently has identified two primary factors. First, deceptive messages can be crafted to seduce audiences with known cognitive biases—a technique widely employed in politically polarized media. Truthful messages typically prove much less enticing than carefully constructed falsehoods.

Second, the nature of social networks, especially when connected by pervasive high-bandwidth digital networks, facilitates rapid propagation of disinformation. Analysis of traffic on social media platforms and extensive simulation modeling shows that deceptive messages typically spread in patterns well-described by epidemiological models used in medicine. Social media “influencers” often serve as “superspreaders” of disinformation.
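
The article does not name a specific epidemiological model, but the analogy can be made concrete with a minimal SIR-style simulation (susceptible, infected/sharing, recovered/lost interest) on a crudely generated follower network. All parameters below are invented for illustration, and seeding the “outbreak” at the best-connected account mirrors the superspreader role attributed to influencers.

```python
import random

random.seed(1)  # reproducible run

N = 2000        # accounts in the toy network
BETA = 0.05     # chance a sharer passes the item to each follower per step
GAMMA = 0.2     # chance a sharer loses interest per step
STEPS = 30

# Crude preferential attachment: each new account follows two existing
# accounts, chosen with probability proportional to their follower counts,
# so a handful of "influencer" accounts accumulate most of the audience.
followers = {0: {1}, 1: set()}          # followers[x] = accounts that follow x
targets = [0, 0, 1]                     # nodes repeated by popularity
for new in range(2, N):
    followers[new] = set()
    for _ in range(2):
        t = random.choice(targets)
        followers[t].add(new)
        targets.append(t)
    targets.append(new)

# SIR-style states: S = unexposed, I = actively sharing, R = lost interest.
state = {n: "S" for n in followers}
influencer = max(followers, key=lambda n: len(followers[n]))
state[influencer] = "I"                 # seed the item with a "superspreader"

for step in range(STEPS):
    for n in [x for x, s in state.items() if s == "I"]:
        for f in followers[n]:                        # expose each follower
            if state[f] == "S" and random.random() < BETA:
                state[f] = "I"
        if random.random() < GAMMA:
            state[n] = "R"
    reached = sum(1 for s in state.values() if s != "S")
    print(f"step {step:2d}: {reached:4d} accounts reached")
```

Plotting the “accounts reached” counts over time typically yields the S-shaped epidemic curve, which is what makes these models a natural fit for describing how a message saturates a network.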

After decades of relative neglect, scientific study of disinformation—how it operates, seduces victims, and propagates—is gaining momentum. As with many scientific fields, the most significant advances in understanding and potentially countering disinformation may still lie ahead.


11 Comments

  1. Elizabeth Smith

    This is a timely and important topic. The history of disinformation shows how new media have repeatedly been exploited for propaganda. The scale and speed of today’s digital landscape have taken the problem to a whole new level. Developing effective countermeasures is critical for safeguarding democracy.

    • Elizabeth Martin

      Agreed. Combating disinformation will require a concerted, multi-stakeholder effort involving policymakers, tech companies, journalists, and the public. Investing in media literacy is a good place to start.

  2. The article highlights how disinformation has evolved alongside technological advances over centuries. It’s alarming to see how quickly and pervasively false narratives can now proliferate online. Addressing this issue will require a multifaceted approach targeting both the supply and demand sides.

  3. Elizabeth Martin

    This is a sobering look at the serious challenges posed by disinformation in the digital age. The historical parallels are eye-opening – it’s clear that new technologies have repeatedly been misused for propaganda purposes. Addressing this issue will require a multifaceted approach targeting both supply and demand.

    • Liam Rodriguez

      Agreed. Improving media literacy and strengthening platform policies and content moderation will be key to curbing the spread of disinformation online.

  4. Interesting look at the evolution of disinformation over the centuries. It’s concerning how digital platforms have amplified the problem, making it easier than ever to spread falsehoods on a massive scale. We’ll need new strategies to counter this growing threat to democracy and social cohesion.

    • William K. Smith

      Agreed. Combating disinformation is one of the biggest challenges we face in the digital age. Improving media literacy and fact-checking will be crucial.

  5. Isabella Davis

    This is a sobering reminder of how quickly and widely disinformation can spread online. The historical context is helpful to understand how new technologies have repeatedly been misused for propaganda. Staying vigilant and developing effective countermeasures will be vital going forward.

    • John Rodriguez

      Indeed, the rise of social media has supercharged the spread of misinformation. Developing better platform policies and user education will be key to limiting the damage.

  6. The article highlights the worrying evolution of disinformation tactics over time, from the printing press to social media. It’s clear that new technologies have consistently been exploited for propaganda purposes. Developing effective countermeasures to address this threat will be critical for preserving democratic norms and institutions.

  7. Elizabeth Jackson

    The article provides valuable historical context on the evolution of disinformation tactics. It’s alarming to see how the problem has escalated with the rise of digital and social media. Developing robust strategies to address this threat will be crucial for maintaining social and political stability.
