The Hidden Threat: How AI Algorithms Shape Our Information Landscape
Australians are increasingly turning away from traditional news sources as opaque AI algorithms dictate what information reaches them online, according to a comprehensive report from the University of Canberra’s News and Media Research Centre.
The study reveals a troubling trend: as local journalism disappears and distrust in mainstream media grows, people are relying more heavily on social media, influencers, and AI chatbots for information—platforms that operate with minimal transparency or accountability for content quality.
“It’s a murky, polluted world where opaque algorithms decide what you see,” explain researchers from the University of Canberra, highlighting how these invisible digital gatekeepers often prioritize content with little regard for accuracy or evidence-based reporting.
The problem has intensified with the rise of “zero-click” AI search results, which present information directly rather than linking to source websites. This approach dramatically reduces traffic to news outlets, further diminishing audience reach, subscription opportunities, and essential revenue streams.
Earlier this year, a News Futures: Media Policy Roundtable gathered 45 leaders from various sectors to address these challenges. Their consensus: the opacity of algorithms on digital platforms represents a fundamental threat to journalism and public trust in information.
The resulting report calls for a fundamental shift in how Australia supports and defines journalism in the digital age, especially as misinformation continues to flourish online.
“A healthy supply of quality news and information can counterbalance misinformation,” the researchers note, pointing to their findings that show a direct correlation between news consumption and people’s ability to identify false information.
Currently, both laws and public education have failed to keep pace with technological developments. There are no standardized protocols for source attribution or verification guidelines, while AI systems often function as “black boxes” with unclear lines of responsibility when they produce errors or exhibit bias.
This lack of transparency contributes to Australians’ low confidence in their ability to verify information. Only about 40% feel confident checking the reliability of websites or social media posts, while just 43% believe they can effectively determine whether online information is truthful.
The situation is further complicated by the growing prevalence of “AI slop” and hallucinations—low-quality or entirely fabricated content produced by artificial intelligence systems. These concerns have made Australians among the most worried about online misinformation globally, with 69% reporting that they avoid news occasionally, sometimes, or often.
“When everything starts to look unreliable, switching off can feel like the safest option,” the researchers explain, describing how information overload and uncertainty drive disengagement.
Digital platforms serve as inherently unreliable interfaces for news delivery, using algorithms to make invisible choices that reshape public access to information without accountability. These systems can arbitrarily elevate some content while burying other material, with minimal regard for journalistic values like accuracy and quality.
“There is no impetus for platforms to explain how their algorithms work or when they change, how news is prioritized, or how AI-generated information is produced,” the report states.
To address these challenges, the roundtable participants identified five key priorities for improving Australia’s information ecosystem. Three specifically target artificial intelligence:
First, they call for greater transparency from tech platforms about how algorithms curate news content. This includes clear labeling of AI-generated content and disclosure rules to help rebuild trust.
Second, they advocate for fair rules regarding AI use of news content, including industry-wide licensing agreements and copyright reforms to ensure news organizations are properly compensated when their work trains AI systems.
Third, they emphasize the need for nationwide media and AI literacy education, teaching people to understand algorithms and identify biases and misinformation.
The remaining recommendations include sustainable journalism funding models that recognize news as a public good, and journalism training for news influencers and digital-first content creators.
“Society can’t afford an information environment in which invisible AI dictates what we see,” the researchers conclude. “Without action, the public interest journalism that underpins democracy and social cohesion will continue to crumble.”
18 Comments
This report highlights the need for a multifaceted approach to addressing the challenges posed by AI-driven content curation. Enhancing media literacy, strengthening fact-checking, and promoting transparency in algorithmic decision-making should all be part of the solution.
The erosion of local journalism is a worrying trend that deserves more attention. As people turn to social media and AI-driven sources, the risk of losing high-quality, evidence-based reporting increases. Protecting local news outlets should be a key focus.
I agree, the decline of local journalism is a serious issue. Platforms should explore ways to support and amplify local news sources to ensure communities have access to reliable, contextual information.
This report highlights the growing influence of artificial intelligence in shaping our information landscape. While AI has many beneficial applications, the potential for bias and lack of transparency is concerning. Robust governance frameworks are needed to ensure AI is deployed responsibly.
This report underscores the need for greater transparency and accountability in the digital landscape. Algorithms that shape our information exposure should be subject to scrutiny and oversight to ensure they align with principles of media pluralism and democratic discourse.
Reliance on social media and AI-driven content is a double-edged sword. While it can provide convenient access to information, the lack of accountability and potential for bias is concerning. Developing robust standards and oversight for these platforms should be a priority.
This is certainly a concerning trend. Increased reliance on social media and AI-curated content could lead to the spread of misinformation and erode trust in quality journalism. Transparency around content curation algorithms is crucial for users to make informed decisions.
I agree, the lack of accountability for these platforms is worrying. Robust fact-checking and disclosure of algorithmic biases could help users better navigate the digital landscape.
The findings around increased reliance on social media and AI-driven content are worrying. These platforms often prioritize engagement over accuracy, which can lead to the spread of misinformation. Developing standards and accountability measures should be a priority for policymakers and industry.
I agree, this is a critical issue that deserves urgent attention. The quality and integrity of information available to the public is fundamental to a healthy democracy.
The decline of local journalism is a disturbing trend with far-reaching implications. As people turn to social media and AI-driven sources, the risk of losing vital local context and community-focused reporting increases. Protecting and supporting local news outlets should be a priority.
Absolutely. Local news is the lifeblood of many communities, providing essential information and holding local institutions accountable. Platforms and policymakers must find ways to ensure these vital sources of journalism can thrive.
This report highlights the delicate balance between technology and media. Artificial intelligence has immense potential, but it must be deployed responsibly and with transparency. Maintaining a healthy information ecosystem is crucial for an informed citizenry.
The rise of zero-click search results is an intriguing development, but it raises concerns about the future of news business models. Platforms and publishers need to find a way to balance user convenience with the sustainability of quality journalism.
Exactly, this is a complex issue that requires collaboration and compromise. Maintaining a healthy information ecosystem is in everyone’s interest, but finding the right balance will be challenging.
The rise of zero-click search results is an interesting development. While it may provide quick access to information, it could also undermine the business models of news outlets that rely on web traffic. A balanced approach is needed to serve user needs while supporting quality journalism.
That’s a good point. Platforms need to find ways to surface authoritative sources without completely cutting off revenue streams for news providers. A collaborative effort may be required to address this complex issue.
This report underscores the need for greater transparency and user empowerment in the digital information landscape. Algorithms that shape our online experiences should be subject to scrutiny and enable users to understand how content is being filtered and prioritized.