AI Disinformation Emerges as Critical Threat, Health Group Calls for National Strategy
As 2026 begins, AI-generated disinformation has become pervasive, eroding public trust in news media, electoral systems, and healthcare initiatives. According to recent findings by the Australian Communications and Media Authority (ACMA), a majority of Australians report encountering misinformation in their daily lives.
The World Economic Forum has ranked disinformation and misinformation as the second most severe short-term global risk, behind only geopolitical economic confrontations. In response to this growing crisis, the French government recently unveiled a comprehensive approach to combating health-related disinformation, underscoring international recognition of the issue.
Climate action initiatives have also fallen victim to coordinated online propaganda campaigns. This week, the ABC reported on Senate select committee hearings in Canberra focused on information integrity in climate change and energy discussions, where experts testified about the damaging impact of organized misinformation.
On the global stage, the United Nations has issued an urgent appeal for countries to implement regulatory measures protecting children from emerging threats like deepfake grooming and traumatic digital content. The UN specifically praised Australia’s pioneering social media ban for users under 16, a policy that has gained attention as eSafety investigators examine sexual deepfakes on Elon Musk’s AI platform Grok.
Despite these initiatives, Croakey Health Media argues that current approaches lack the comprehensive framework needed to address the problem effectively. In their 2026-27 pre-budget submission, filed on January 30, the organization called on the Albanese Government to develop and fund a national, whole-of-government strategy to tackle what they describe as a “public health emergency.”
“The instability, unreliability and hazardous state of our news and information systems is undermining democracy, social cohesion, evidence-informed policy, economic and political stability, and the safety, health and wellbeing of communities,” Croakey stated in their submission.
The rapid advancement and adoption of artificial intelligence technologies have significantly amplified these risks. As Foreign Affairs Minister Senator Penny Wong told the UN Security Council in September 2025: “While once we grappled to discern fact from propaganda, we are now witness to a collapse of truth altogether. Content deliberately designed to deceive is now almost indistinguishable from reality.”
The World Economic Forum’s Global Risks Report 2026 draws direct connections between misinformation, growing social polarization, and rising authoritarianism. The report identifies addressing misinformation and disinformation as essential to rebuilding institutional trust and reducing societal divisions.
In Australia, health authorities have witnessed firsthand how misinformation undermines support for evidence-based public health measures, including vaccination programs. Yet Croakey notes that Australia lacks an integrated, AI-focused nationwide strategy that coordinates actions across all portfolios and levels of government.
While the organization welcomes current initiatives such as ACMA’s co-regulatory role, media literacy programs, and the News Media Assistance Program, they argue these disconnected efforts are insufficient to address the scale of the problem.
Croakey is advocating for the establishment of a National Commission on Disinformation to coordinate a comprehensive response across federal, state, territory, and local governments, as well as civil society. Their proposed framework would include support for growing independent media, particularly in “news deserts” where local journalism has disappeared and in areas dominated by corporate media interests.
The organization envisions this strategy addressing disinformation beyond politics, tackling misinformation related to climate disruption, vaccine hesitancy, Aboriginal and Torres Strait Islander self-determination, and the undermining of democratic institutions.
In a positive development, Croakey has secured a $97,000 grant, distributed over three years, through the federal government’s Journalism Assistance Fund Program to support public interest journalism. While welcoming this funding, Croakey emphasizes that short-term grants do not provide sustainable solutions to the structural challenges facing independent media organizations.
Their pre-budget submission urges the government to allocate a portion of its advertising budget to members of the Local and Independent News Association (LINA), creating a more stable foundation for diverse media outlets working to combat the growing crisis of AI-powered disinformation.
8 Comments
The rise of AI-generated disinformation is particularly alarming. It can be difficult for the average person to discern truth from fiction online. Robust fact-checking and public education initiatives will be essential to empower Australians to navigate the information landscape safely.
Absolutely. Regulators and policymakers need to stay ahead of technological advancements that enable the proliferation of false and misleading content. Collaborative solutions between government, tech platforms, and civil society will be crucial.
Disinformation can undermine critical public discussions around issues like climate change. It’s concerning to hear that organized misinformation campaigns are distorting the national dialogue on these important topics. A national strategy is sorely needed to uphold information integrity.
As an extractives-focused investor, I’m particularly concerned about the potential impact of disinformation on public perceptions of the mining industry. Factual, balanced information is crucial for informed decision-making around critical minerals and energy transition. A national strategy is sorely needed.
The global nature of this problem highlights the need for international coordination and best practice sharing. While each country may have unique challenges, there are likely common lessons and approaches that can be adopted. Australia should look to other nations’ experiences in combating disinformation.
Good point. Tackling disinformation requires a collaborative, cross-border effort. Sharing intelligence, strategies, and policy solutions will be key to building a more resilient information ecosystem worldwide.
This is a concerning issue that demands a coordinated national response. Disinformation can have serious consequences for public trust and democratic processes. It’s crucial that Australia develops a comprehensive strategy to identify, counter, and mitigate the spread of online falsehoods.
Agreed. Combating disinformation requires a multi-pronged approach involving media literacy, fact-checking, and international cooperation. Transparency and public awareness will be key to building resilience against manipulative narratives.