Australian inquiry reveals systemic climate misinformation undermining public debate
Australia’s first parliamentary inquiry into climate and energy misinformation has uncovered a systematic problem distorting public discourse, though its most consequential solutions remain outside the main recommendations.
The inquiry, which spanned 11 days of hearings and processed hundreds of submissions, documented a growing “information integrity gap” that is delaying climate policy implementation, eroding trust in scientific institutions, and posing a significant threat to Australia’s democratic processes.
Evidence presented to the committee revealed that climate misinformation is not a series of isolated incidents but a sophisticated, coordinated effort. Multiple submissions detailed “astroturfing” campaigns—where organizations create the illusion of grassroots support—including fake social media accounts impersonating Australians to manufacture opposition to renewable energy projects.
Experts testified that digital platforms are amplifying false and misleading claims through opaque algorithms, with the rise of artificial intelligence accelerating the production of deceptive content. The inquiry’s findings suggest this problem transcends occasional falsehoods to become a structural feature of Australia’s information ecosystem.
While the main report recommends a broad, whole-of-government approach to improving information integrity—including adopting international frameworks, increasing funding for regulators and research, improving transparency around political influence, enhancing digital literacy, and strengthening oversight of digital platforms—critics argue these high-level recommendations lack the specificity needed for meaningful change.
The report’s most concrete and ambitious solutions appear not in the main body but in additional comments from senators across the political spectrum. A majority of committee members—including Greens chair Peter Whish-Wilson, two Labor senators, independent David Pocock, and Liberal Andrew McLachlan—acknowledged that the main report falls short of recommending necessary structural reforms.
Among the most pressing solutions identified in these supplementary sections is the need for truth in political advertising laws. “Australians continue to see misleading political advertising deployed with impunity,” wrote Pocock and McLachlan, echoing concerns from organizations like the Centre for Public Integrity, the Australian Conservation Foundation, and the Climate Council.
The senators also highlighted the absence of enforceable regulations governing “inauthentic behavior” online. The inquiry heard direct evidence of coordinated campaigns designed to manipulate public perception, including testimony from Farmers for Climate Action about fake social media profiles creating an illusion of widespread opposition to renewable energy.
Australia’s current regulatory approach relies heavily on voluntary industry codes and lacks enforceable obligations regarding false content. There are no clear legal requirements for platforms to remove bot accounts or label automated content—a gap that experts warned will become increasingly dangerous as AI technologies advance.
The Greens’ additional comments took direct aim at the fossil fuel industry, asserting that it “knew, lied, and denied catastrophic climate change, and then sabotaged climate action for decades, all the while raking in billions of dollars in profits every year.” Their proposals include a real-time public register of political advertising, potential bans on fossil fuel advertising, stronger disclosure requirements for online advertisers, expanded lobbying rules, and enhanced regulatory oversight of digital platforms.
The inquiry’s cautious approach in its main report reflects divided committee politics. Three conservative senators issued dissenting reports not only rejecting the recommendations but questioning the inquiry’s legitimacy. Nationals Senator Matt Canavan described it as “an attempt to bully and cajole people into silence,” while One Nation’s Malcolm Roberts called it “the worst senate inquiry I have experienced.”
Despite these limitations, the inquiry represents a significant milestone as possibly the first parliamentary investigation worldwide to examine information integrity in climate and energy—a problem the United Nations recognizes as a major barrier to effective climate action.
The inquiry has documented compelling evidence that misinformation and disinformation are structural features of today’s information ecosystem, amplified by digital platforms, political incentives, and coordinated campaigns. While its main report offers only a partial roadmap for addressing these issues, the more ambitious, targeted solutions remain available in the appendices for those willing to look beyond the headlines.
16 Comments
While it’s disappointing that the inquiry didn’t recommend major reforms, I’m curious to learn more about the specific policy proposals that were considered. Even incremental steps could make a meaningful difference.
This is a complex issue with implications for climate action, democratic institutions, and public trust. I appreciate the inquiry shining a light on the problem, but hope more consequential reforms are still on the table.
It’s disappointing the inquiry didn’t recommend more substantial reforms, given the evidence of systemic misinformation. Strengthening transparency and accountability should be key priorities moving forward.
I share your disappointment. Tackling this issue will require a multi-faceted approach focused on content moderation, algorithmic amplification, and addressing the underlying drivers of misinformation.
The findings of this inquiry are deeply troubling. Organized misinformation campaigns pose a grave threat to climate action and democratic processes. Substantive reforms are urgently needed to combat this problem.
I agree, the scale of the challenge requires a robust and comprehensive response. Policymakers, platforms, and the public all have a critical role to play in restoring trust and integrity.
The findings of this inquiry are deeply concerning. Organized misinformation campaigns erode trust and delay critical climate action. Reforms are urgently needed to address this growing threat.
This is a complex challenge with serious implications. While the lack of major reforms is concerning, I’m hopeful the inquiry will spur further investigations and policy proposals to address this threat to public discourse.
While the inquiry findings are disappointing, I’m curious to learn more about the specific recommendations that fell short of major reforms. What were the key gaps identified but not addressed?
That’s a good question. It would be helpful to understand the committee’s rationale for not proposing more substantive solutions, given the scale of the problem documented.
This is a troubling situation that undermines public discourse and climate action. Systemic misinformation efforts require a robust response to protect democratic processes and scientific institutions.
I agree, the scale of the problem calls for substantive solutions. Platforms, policymakers, and the public all have a role to play in combating this threat.
The evidence of organized misinformation campaigns and algorithmic amplification is deeply concerning. Tackling this will require a multi-pronged approach involving platforms, policymakers, and the public.
I agree, a comprehensive strategy is needed to address the root causes and stop the spread of climate misinformation. Transparency and accountability should be key priorities.
This is a concerning issue that undermines public discourse and climate action. Concerted misinformation efforts to sow doubt and delay policy are worrying. Transparency and accountability are needed to combat these sophisticated, coordinated campaigns.
I agree, the evidence of systemic climate misinformation is troubling. Platforms need to improve content moderation and rein in algorithmic amplification to stop the spread of deceptive claims.