Australians increasingly struggle to navigate a digital information landscape dominated by opaque algorithms and AI-generated content, according to a new report from the University of Canberra’s News and Media Research Centre.

The report, released today, highlights how Australians are disengaging from traditional news sources in favor of social media, influencers, and AI chatbots, despite growing concerns about information reliability. This shift comes as local journalism continues to disappear across the country, creating information vacuums that are easily filled by misinformation.

“It’s a murky, polluted world where opaque algorithms decide what you see,” the researchers write, noting that these algorithms often prioritize engagement over accuracy or quality of information.

The situation has worsened with the emergence of “zero-click” AI search results that show information directly without requiring users to visit news websites. This trend further reduces traffic to news outlets, threatening their already precarious business models and accelerating the decline of professional journalism.

Earlier this year, the News Futures: Media Policy Roundtable gathered 45 leaders from various sectors to address these challenges. Participants identified algorithmic opacity—the hidden mechanisms that determine what content users see online—as a fundamental threat to journalism and public trust.

Research shows that Australians have remarkably low confidence in their ability to verify information. Only about 40% feel confident checking if a website or social media post is trustworthy, and just 43% believe they can determine if information they find online is accurate. Not surprisingly, Australians rank among the world’s most concerned populations regarding online misinformation.

“When everything starts to look unreliable, switching off can feel like the safest option,” the report states. This may explain why 69% of Australians now avoid the news at least occasionally.

The problem extends beyond user behavior. Digital platforms act as unreliable gatekeepers of information, making invisible and unaccountable choices about content prioritization. These platforms typically have no incentive to explain how their algorithms work, how news is prioritized, or how AI-generated information is produced.

The proliferation of AI-generated content compounds these issues. AI systems frequently produce what experts call “AI slop” and “hallucinations”—low-quality and factually incorrect content that can be difficult to distinguish from reliable information.

To address these challenges, roundtable participants identified five priorities for improvement:

First, they called for greater transparency from technology platforms. Australians deserve to know how algorithms curate news on search engines, social media, and AI chatbots. Clear labeling of AI-generated content would help users make more informed choices.

Second, the experts advocated for fair rules governing AI’s use of news content. This includes industry-wide licensing agreements, copyright reform, and stronger competition laws to ensure news organizations are properly compensated when their work is used to train generative AI tools.

Third, they emphasized the importance of media and AI literacy education nationwide. Teaching people how algorithms work and how to identify bias and misinformation represents one of the most cost-effective interventions available.

Fourth, the report argues journalism funding should reflect its role as a public good. Rather than relying on one-off grants, sustainable alternatives such as tax offsets for journalists’ salaries could directly support newsrooms, particularly small and regional outlets.

Finally, the experts recommended journalism training for news influencers, content creators, and digital-first outlets, along with a common industry code to ensure quality across the entire news ecosystem.

“Society can’t afford an information environment in which invisible AI dictates what we see,” the researchers conclude. “Without action, the public interest journalism that underpins democracy and social cohesion will continue to crumble.”

The report represents a collaborative effort by University of Canberra researchers Sora Park, Janet Fulton, Momoko Fujita, and Saffron Howden, bringing together perspectives from academia, industry, government, and technology sectors to address one of the most pressing challenges to Australia’s information ecosystem.


