The concept of “disinformation” traces its roots to Soviet Russia, where “dezinformatsiya” described false information designed to undermine public trust in the Communist Party. A Cold War term, it gained widespread traction in America only after the 2016 presidential election, when Donald Trump’s unexpected victory over Hillary Clinton prompted a search for explanations.

Reports of Russian interference in the election helped establish “disinformation” as a dominant narrative in American political discourse. What began as a seemingly legitimate concern about foreign interference quickly expanded to include domestic political opponents and viewpoints that challenged progressive orthodoxy.

The concept’s expansion created an entire industry of “disinformation experts” at universities, non-profit organizations, and tech companies. These experts wielded significant influence over what information could circulate on major social media platforms. Their scope continuously broadened, from targeting Russian election interference to suppressing alternative theories about COVID-19’s origins, skepticism about vaccine efficacy, and traditional views on gender.

This framework spawned increasingly complex terminology. “Misinformation” referred to false information shared unknowingly, while “misleading information” described content that might guide people toward incorrect conclusions. Perhaps most troubling was the emergence of “malinformation”: factually accurate information deemed harmful if shared publicly, such as true accounts of vaccine side effects that might discourage vaccination.

Katherine Maher, now CEO of National Public Radio, articulated this shift when she stated in 2021: “Our reverence for the truth might be a distraction that’s getting in the way of finding common ground and getting things done.”

The disinformation framework inherently identified enemies—not foreign adversaries as in the Soviet model, but fellow citizens with opposing political views. This approach effectively recast millions of Americans as dangerous threats, further polarizing an already divided nation.

While direct government censorship would violate the First Amendment, evidence suggests significant pressure from public officials on tech platforms. Mark Zuckerberg has acknowledged that the Biden administration pressured Meta over COVID-19 content and that an FBI warning preceded the company’s suppression of the Hunter Biden laptop story, a decision he now says he regrets.

The January 6, 2021, Capitol riot accelerated these efforts. Major platforms “deplatformed” then-President Trump and thousands of his supporters. As revealed in the “Twitter Files” following Elon Musk’s acquisition of the platform, sophisticated systems had been developed to monitor millions of posts daily with input from government-approved experts.

Ironically, those fighting “disinformation” sometimes engaged in spreading it themselves. The Hunter Biden laptop story represents perhaps the clearest example—a legitimate news report that was systematically suppressed as “Russian disinformation” by a coordinated effort involving media outlets, intelligence officials, and social media platforms, despite being factually accurate.

The war on disinformation has largely failed on its own terms. Attempts to suppress COVID “misinformation” eroded public trust in health authorities. Censorship of conservative viewpoints on gender issues intensified opposition to progressive policies. And the prolonged battle against “Trumpist disinformation” culminated in Trump’s strongest electoral performance in 2024.

Looking forward, several trends appear likely. The fact-checking and anti-disinformation industry will likely be dismantled or transformed as platforms cut ties and political pressure shifts. However, this may risk a McCarthyist backlash against individuals previously involved in content moderation.

Meanwhile, both government and corporate attempts to regulate digital speech will likely increase, especially outside the United States where First Amendment protections don’t apply. Platform owners like Zuckerberg and Musk will continue wielding significant influence over what speech is permitted.

Public response to increasing polarization has included “news avoidance” and reduced engagement on social platforms—a natural immune response that protects individuals but potentially harms democratic discourse by leaving only the most extreme voices in the conversation.

Finally, digital spaces are increasingly balkanizing along political, national, and demographic lines—from the split between Bluesky and Twitter/X to divergent regulatory approaches in Europe and America.

For now, media literacy remains the most viable approach for addressing digital misinformation without compromising democratic principles. As we enter the second decade of widespread social media usage, greater public awareness of online manipulation may be our strongest defense against the harmful effects of false information.


8 Comments

  1. Patricia Z. Martin

    As someone interested in mining and commodities, I wonder how the disinformation debate has impacted discussions around topics like renewable energy, battery metals, or the energy transition. Have alternative views on these subjects been unfairly dismissed as ‘disinformation’?

  2. Jennifer Lopez

    Fascinating look at the evolution of disinformation and how it has become a catch-all term to discredit views that challenge mainstream narratives. Maintaining truth and trust in media is critical, but the scope of ‘disinformation’ seems to have expanded far beyond its original intent.

  3. James W. Garcia

    This article raises important questions about how we define and combat disinformation in the digital age. I’m curious to hear more perspectives on striking the right balance between addressing genuine threats and preserving open discourse. It’s a nuanced issue without easy answers.

  4. Patricia Martinez

    This issue of disinformation is a complex one. While it’s important to address the real problem of foreign interference, I worry the term has been weaponized to suppress legitimate debate and dissenting views. We need to find a balance between combating genuine misinformation and preserving free speech.

    • Oliver Thompson

      I agree, it’s a fine line to walk. Discernment is key – we must be able to critically evaluate information from diverse sources rather than relying on a narrow set of ‘approved’ narratives.

  5. James V. Moore

    This is a complex issue with no easy answers. I appreciate the nuanced perspective presented here. It’s important to find ways to combat genuine disinformation while preserving the free exchange of ideas and diverse viewpoints, even on sensitive topics like mining and energy.

  6. Emma I. Thompson

    The rise of ‘disinformation experts’ is concerning. Are they truly impartial arbiters of truth, or do they have their own agendas? Healthy skepticism is warranted when any group wields significant influence over information flows, especially on social media platforms.

  7. Liam Hernandez

    The expansion of ‘disinformation’ as a concept is worrying. While combating foreign interference is crucial, I fear the term has become a convenient tool to suppress views that challenge established narratives, even on technical subjects like mining and energy. We must remain vigilant.
