The rise of AI-powered propaganda operations represents a seismic shift in how influence campaigns are conducted globally, according to cybersecurity expert Dr. Lukasz Olejnik. These automated systems now enable both state and non-state actors to deploy sophisticated influence operations using readily available consumer technology.
Unlike traditional propaganda efforts that required substantial human resources, these new AI propaganda factories can maintain consistent personas across multiple platforms, operate continuously without breaks, and autonomously adapt to different conversational contexts. This automation fundamentally changes the economics and scale of disinformation campaigns.
“What we’re seeing is a democratization of capabilities that were once limited to well-resourced intelligence agencies,” explains a security researcher familiar with the technology. “Today’s locally run AI systems allow almost anyone with basic hardware to create and manage multiple politically or ideologically aligned personas that appear convincingly human.”
Dr. Olejnik’s research identifies three key vulnerabilities these systems exploit: technological infrastructure, institutional weaknesses, and human psychological tendencies. His empirical measurements of AI system performance confirm that fully automated influence operations have already moved beyond theoretical concerns to practical reality.
This development marks a significant turning point in the threat landscape. Previously, major influence campaigns required substantial investment in human operators and coordination. Now, similar operations can be conducted using consumer-grade hardware and publicly available AI models, dramatically lowering the barriers to entry.
Western democracies face particularly challenging scenarios due to this asymmetric threat environment. The open nature of these societies creates numerous attack vectors that can be exploited by a growing number of actors, while defensive capabilities remain limited and unevenly distributed.
“The fundamental challenge is that attacking is becoming exponentially easier while defending remains difficult,” notes a cybersecurity policy expert at a major think tank. “This creates a dangerous imbalance that our current governance structures weren’t designed to address.”
The implications extend beyond individual disinformation campaigns. As these AI propaganda systems proliferate, they could overwhelm content moderation systems, fragment public discourse, and erode trust in information sources. During elections or crises, they could flood information channels with misleading content tailored to exploit societal divisions.
Dr. Olejnik’s work highlights the urgent need for new approaches to counter these threats. Among the most pressing requirements are improved detection frameworks that can identify AI-generated content, crisis coordination mechanisms that allow rapid responses to emerging campaigns, and governance approaches that can effectively regulate these technologies without stifling innovation or limiting free expression.
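One building block of such detection frameworks is spotting coordinated, templated posting — many accounts repeating near-identical messages. The sketch below is a minimal illustration of that idea, not a production detector; the function name `near_duplicate_ratio` and the 0.9 similarity threshold are illustrative assumptions, not drawn from any named system.

```python
from difflib import SequenceMatcher

def near_duplicate_ratio(posts: list[str], threshold: float = 0.9) -> float:
    """Fraction of posts that are near-duplicates of an earlier post.

    A crude coordination signal: templated bot campaigns tend to repeat
    the same message with minor variations. Threshold of 0.9 is an
    illustrative choice, not a calibrated value.
    """
    if not posts:
        return 0.0
    duplicates = 0
    for i, post in enumerate(posts):
        # Compare each post against all earlier posts in the thread.
        if any(SequenceMatcher(None, post, prior).ratio() >= threshold
               for prior in posts[:i]):
            duplicates += 1
    return duplicates / len(posts)
```

A real system would combine signals like this with posting-time correlation, account metadata, and content classifiers; string similarity alone is easy to evade.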
Industry experts suggest potential countermeasures might include watermarking requirements for AI-generated content, advanced authentication systems for online identities, and improved media literacy education to help citizens better evaluate information sources.
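To make the watermarking idea concrete, here is a toy sketch of one family of statistical text watermarks: the generator biases each step toward a pseudorandom "green" half of the vocabulary keyed on the previous token, and a detector scores how far a text's green-token count deviates from chance. Everything here (the character-level vocabulary, the function names, the greedy `watermark_extend` stand-in for biased sampling) is an illustrative assumption, not the scheme of any deployed system.

```python
import hashlib
import math

# Toy character-level vocabulary; a real scheme operates on an LLM's token ids.
VOCAB = list("abcdefghijklmnopqrstuvwxyz ")
GREEN_FRACTION = 0.5

def green_list(prev_token: str) -> set[str]:
    """Deterministically select a 'green' half of the vocabulary, keyed on the previous token."""
    ranked = sorted(VOCAB,
                    key=lambda t: hashlib.sha256((prev_token + t).encode()).hexdigest())
    return set(ranked[: int(len(VOCAB) * GREEN_FRACTION)])

def watermark_extend(seed: str, length: int) -> str:
    """Extend text by always picking a green token (a stand-in for biased sampling)."""
    text = seed
    for _ in range(length):
        text += min(green_list(text[-1]))
    return text

def green_count(text: str) -> int:
    """Count tokens that fall in the green list of their predecessor."""
    return sum(1 for prev, cur in zip(text, text[1:]) if cur in green_list(prev))

def z_score(text: str) -> float:
    """Deviation of the green count from the unwatermarked expectation, in standard deviations."""
    n = len(text) - 1
    expected = n * GREEN_FRACTION
    std = math.sqrt(n * GREEN_FRACTION * (1 - GREEN_FRACTION))
    return (green_count(text) - expected) / std
```

Fully watermarked text scores a z of roughly the square root of its length, while unwatermarked text hovers near zero in expectation; the open question such proposals face is robustness, since paraphrasing or editing the text dilutes the signal.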
Dr. Olejnik brings significant expertise to this analysis. As a Visiting Senior Research Fellow at King’s College London’s Department of War Studies, former cyberwarfare advisor at the International Committee of the Red Cross, and former member of the W3C Technical Architecture Group, he has worked at the intersection of technology, security, and policy for years. His books “Philosophy of Cybersecurity” and “Propaganda: From Disinformation and Influence to Operations and Information Warfare” provide comprehensive examinations of these evolving threats.
As AI systems continue advancing, the window for establishing effective governance frameworks may be narrowing. Technology companies, governments, and civil society organizations face increasing pressure to develop coordinated responses that balance security concerns with democratic values.
“The technological capability to create persuasive automated propaganda already exists,” warns a digital rights advocate. “The question now is whether our societal defenses can evolve quickly enough to maintain information integrity in this new environment.”