Disinformation Now “Persistent Feature” of Digital Landscape, Expert Says
Disinformation has evolved from isolated viral falsehoods into a persistent feature of our digital environment, according to Yuliia Dukach, Head of Disinformation Investigations at OpenMinds. In a wide-ranging interview, Dukach explains how contemporary influence operations have been transformed by platform architectures, data access, and automated systems.
Drawing on her extensive investigative work in Ukraine, one of the most heavily targeted information environments in the world, Dukach describes how enduring narratives attach themselves to current events and how the design of social platforms shapes their behavioral impact.
“What tends to change is not the underlying logic, but rather the specific news hook, the content production script, and, gradually, the technologies and channels of dissemination,” Dukach explains.
Ukraine has been resisting Russian information operations since at least 2014, following the annexation of Crimea and the start of conflict in the Donetsk and Luhansk regions. This experience has given Ukrainians a clearer understanding of core narratives and strategic objectives behind these efforts.
“The better we understand these persistent narratives, the easier it becomes to anticipate which events they are likely to be attached to,” she notes.
While Russia conceptualizes its activities as an “information war,” implying state-to-state confrontation, Dukach argues the term is not merely metaphorical. In Ukraine, information operations have frequently run in parallel with physical occupation, with military units establishing networks of local Telegram channels to legitimize control of territory.
“‘Information war’ is not a metaphor but a component of military conflict,” she says. This logic increasingly extends beyond Ukraine, with information operations and cyber attacks accompanied by subversive activities across Europe, Africa, and Latin America.
Platform architecture plays a critical role in amplifying these efforts. Social media enables highly granular targeting, allowing messages to be tuned to specific vulnerable groups and making behavioral influence far more feasible.
Dukach cites a 2020 incident in the Ukrainian town of Novi Sanzhary, where protests erupted against COVID-19 evacuees placed in local quarantine. The unrest was amplified by anonymous accounts and by newly created local Viber groups run by unknown administrators. “The operation succeeded because it was locally embedded, focused on a highly uncertain and emotionally charged issue, and timed precisely,” she explains.
The rise of generative AI has dramatically transformed the landscape. What were once easily detectable Russian bot networks now deploy sophisticated tactics that make them nearly indistinguishable from genuine users.
“Today, these bots look very different. They post in sync with publication times and produce unique comments at scale, because generative language models have replaced large teams of human operators, making content production faster and significantly cheaper,” Dukach says.
These systems now take as input not just the narrative to promote, but also the original post and full comment thread, allowing them to generate highly adaptive responses that integrate seamlessly into discussions. “At this point, such bots are no longer distinguishable ‘by eye’ — they can only be identified through large-scale data analysis and behavioral indicators of coordination.”
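One such behavioral indicator is reaction timing: accounts that repeatedly comment within seconds of one another across many posts are unlikely to be independent. Below is a minimal, illustrative Python sketch of that idea, using invented account and post identifiers; it is a generic example of coordination analysis, not a description of the actual OpenMinds methodology.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical input: (account_id, post_id, seconds elapsed between
# a post's publication and the account's comment on it).
comments = [
    ("acct_a", "post_1", 42), ("acct_b", "post_1", 45),
    ("acct_a", "post_2", 30), ("acct_b", "post_2", 33),
    ("acct_c", "post_1", 5400),
]

# Index each account's reaction delay by post.
delays = defaultdict(dict)
for account, post, seconds in comments:
    delays[account][post] = seconds

def timing_similarity(a, b, tolerance=15):
    """Fraction of shared posts on which two accounts commented
    within `tolerance` seconds of each other."""
    shared = set(delays[a]) & set(delays[b])
    if not shared:
        return 0.0
    close = sum(abs(delays[a][p] - delays[b][p]) <= tolerance for p in shared)
    return close / len(shared)

# Flag account pairs whose reaction timing is suspiciously synchronized.
for a, b in combinations(delays, 2):
    score = timing_similarity(a, b)
    if score >= 0.8:
        print(f"possible coordination: {a} <-> {b} (score {score:.2f})")
```

In practice, analysts combine many such signals (shared phrasing, account creation dates, overlapping target lists) rather than relying on timing alone; the sketch shows only the shape of the approach.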
The biggest challenge for researchers remains platform data access. While Telegram, which became the primary source of news for over 80% of Ukrainians after 2022, offers an open API that can be systematically monitored, platforms like Facebook have made even aggregated data increasingly difficult to obtain.
“In Ukraine, around half of the adult population used Facebook for communication in 2025. Yet there is very little data-driven research on influence operations on the platform — not because such operations do not exist, but because Facebook has never provided Ukrainian researchers with access to raw, non-aggregated data,” Dukach notes.
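Telegram’s open API is what makes that kind of systematic monitoring feasible. As a minimal illustration, the sketch below uses the third-party Telethon library (one common choice among several) to pull recent messages from a list of public channels. The credentials and channel names are placeholders, and this is not a description of any specific research pipeline.

```python
from telethon import TelegramClient

# api_id and api_hash are issued at my.telegram.org; the values and
# channel names below are placeholders, not real credentials.
client = TelegramClient("monitor_session", api_id=12345, api_hash="...")

CHANNELS = ["example_channel_one", "example_channel_two"]

async def collect():
    async with client:
        for channel in CHANNELS:
            # Iterate over the most recent messages in each public channel.
            async for msg in client.iter_messages(channel, limit=100):
                if msg.text:
                    print(channel, msg.date.isoformat(), msg.text[:80])

client.loop.run_until_complete(collect())
```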
Despite heightened awareness of disinformation in Ukraine, misconceptions persist. Many recognize disinformation as a serious issue but don’t see themselves as personally affected. “It is often perceived as a problem for ‘other’ — less informed or more naïve — audiences,” says Dukach.
Looking ahead, Dukach identifies two significant challenges: the growing scale of inauthentic behavior enabled by generative AI, and the concentration of data by global commercial platforms.
“How do we distinguish genuine voices from synthetic ones, especially in environments where personal account verification or identity-based solutions carry serious risks in fragile or non-consolidated democracies?” she asks.
Ukraine’s resilience has come not from institutional capacity alone, but from a dense ecosystem of civil society organizations connected to government institutions. This triangle of media, government centers, and non-governmental organizations—with support from the international community—offers a potentially transferable model for other regions facing similar threats.
“What matters is not institutional strength on paper, but the ability of different actors to coordinate, share information, and respond quickly across sectors,” Dukach concludes.
8 Comments
Ukraine’s experience in resisting information operations provides valuable lessons. Their insights into core narratives and strategic objectives could inform more effective countermeasures elsewhere.
Curious to learn more about the specific data access and automated systems that have transformed disinformation tactics. Understanding the technical capabilities fueling these campaigns is key to developing robust countermeasures.
The challenges outlined here underscore the importance of international cooperation in addressing disinformation. Shared learnings and coordinated responses will be key to staying ahead of malicious actors.
Agreed. No single entity can tackle this alone. Leveraging global expertise and aligning policies/practices across borders will be critical to disrupting the infrastructure of disinformation campaigns.
Interesting insight on the evolving nature of disinformation. It’s critical that we stay vigilant and continue developing effective counter-strategies as these tactics adapt to new technologies and platforms.
Absolutely. Combating disinformation requires a multifaceted approach – from improving platform transparency to bolstering media literacy. We must remain proactive in the face of this persistent challenge.
Algorithmic amplification is a concerning vulnerability that platform providers must address. Transparency around content moderation and distribution practices is essential for building public trust.
Agreed. Empowering users with more control over their information feeds is also important. Giving individuals the ability to customize and curate their online experience can help mitigate the effects of algorithmic biases.