In an era where digital manipulation increasingly shapes public opinion, a pioneering competition is challenging participants to understand the mechanisms behind social media disinformation campaigns.
From the 2016 US presidential election to COVID-19 and last year's Australian federal election, the impact of disinformation on social media platforms has grown dramatically, fundamentally altering how people form opinions and make decisions.
The threat landscape continues to evolve rapidly. Fake AI-generated videos depicting global conflicts and even Donald Trump’s trade policies have demonstrated how virtually no topic remains safe from social media manipulation and propaganda efforts.
Australia’s electoral integrity faces similar challenges. The Australian Electoral Commission, the nation’s independent body responsible for conducting federal elections, has acknowledged the near impossibility of preventing campaign misinformation during electoral periods.
“During a federal election in Australia, it is reasonable to expect that there could be AI used in election communication… to mislead voters,” the Commission stated, highlighting the growing concern among electoral authorities worldwide.
These concerns are being taken seriously at the highest levels of Australian politics. Just last week, the Australian Labor Party announced a comprehensive review of their strategy for the upcoming 2025 election campaign, with specific emphasis on addressing cyber misinformation and artificial intelligence threats that could potentially influence voter behavior.
In response to these mounting challenges, the UNSW Institute for Cyber Security has partnered with the School of Computer Science and Engineering to launch “Capture the Narrative” – described as Australia’s most innovative AI-based cybersecurity competition to date.
The event represents a global first: a simulated social media manipulation competition where participants work in teams to shape dominant narratives on a fictional social media platform. Competitors chase likes and followers to demonstrate how misinformation can spread and influence simulated electoral processes.
“This competition is not like other Capture the Flag competitions where the main purpose is to be a good penetration test,” explained Dr. Rahat Masood, co-lead of the competition. Instead, participants will exploit AI and large language models to understand their impact in propagating misinformation across social media environments.
The competition’s ultimate goal extends beyond mere technical demonstration. “We want to make students knowledgeable about misinformation, disinformation, how it propagates, and what is the impact and influence of it at a higher level – for example – politics and propaganda,” Dr. Masood added.
This initiative comes at a critical juncture when technology companies, governments, and civil society organizations worldwide are grappling with how to maintain information integrity in digital spaces. Recent years have seen social media platforms implementing varying degrees of content moderation and fact-checking initiatives, though critics argue these measures have proven insufficient against sophisticated disinformation campaigns.
The competition also reflects growing academic interest in studying disinformation not just as a technical problem, but as a complex socio-technical phenomenon with far-reaching implications for democratic processes and social cohesion.
By creating a controlled environment where participants actively engage in narrative manipulation, “Capture the Narrative” aims to develop a deeper understanding of how misinformation operates – knowledge that could prove invaluable for developing more effective countermeasures in real-world contexts.
As Australia and other democracies prepare for future elections in an increasingly AI-powered information landscape, initiatives like this one may provide crucial insights into protecting electoral integrity against ever more sophisticated digital threats.