In a digital age where information spreads at lightning speed, local governments across the UK are battling a growing threat to democracy: sophisticated fake council communications that can reach hundreds of thousands of people in mere hours.

City of York Council recently found itself at the center of several viral misinformation campaigns when fabricated council notices began circulating widely on social media. One falsely appealed for homeowners to house asylum seekers, another sought volunteers to remove St George's flags, and a third asked residents to fill potholes themselves.

At first glance, these notices appeared legitimate, featuring what looked like official council logos and formatting. However, closer inspection revealed telltale signs of forgery. BBC Verify, which analyzed the images, noted blurry logos lacking proper detail, inconsistent fonts, spelling errors, and anatomical anomalies like misshapen hands – common indicators of AI-generated content.

“It’s totally untrue and it’s fake, and the problem is that people don’t think that’s the case,” said City of York Council leader Clare Douglas. While acknowledging that misinformation isn’t new – “It started with the printing press, centuries and centuries ago” – she emphasized the modern challenge: “It’s just how easy it is to do now, and how difficult it is to detect whether something is true.”

The potential reach of such fabrications is staggering. The BBC calculated that the fake asylum seeker notice alone appeared on accounts with a combined following exceeding half a million users.

The phenomenon extends beyond York. In Barnsley, the council has confronted content creators spreading falsehoods about local government activities. When shown the fake York council images, Barnsley Council leader Sir Steve Houghton admitted, “It looks real, I wouldn’t know.”

More troubling still, Houghton revealed that some content creators refused to remove false information when approached by the council. “We’ve even had some people with content saying we’re not going to change this because we’re making money out of it. Now that is unbelievable,” he said.

These fabrications appear strategically designed to exploit divisive topics. “We do at the moment get a lot of misinformation about asylum seekers and disinformation about asylum seekers,” Houghton noted. “We’ve got to correct that because people need to be safe.”

The ease with which such convincing fakes can be created presents a formidable challenge. Ilya Yablokov from the University of Sheffield’s Disinformation Research Cluster explained that the barrier to entry for creating misinformation has essentially disappeared.

“The cost of production of this is almost zero, right? You just have a laptop and you can create a poster,” Yablokov said. “You go to the website of your council, you take a screenshot, you attach it to a leaflet then you write in big letters whatever the main message is.”

The problem is compounded by user behavior on social media platforms. Most people scroll quickly, lacking the time, interest, or motivation to verify what they’re seeing before sharing. “People just don’t have interest or motivation to make the next step and really try to [verify], because they basically rely on their biases,” Yablokov added.

As both local elections and a potential general election loom in the UK this year, the timing of these misinformation campaigns raises particular concerns about potential impact on voter decisions and democratic processes.

Local authorities have limited tools to combat the spread of false information. Most resort to using their own communications channels to issue corrections and clarifications, but these efforts often struggle to match the reach of viral misinformation.

With artificial intelligence making fake content both easier to create and harder to detect, the challenge for local governments continues to grow. As Yablokov ominously observed, “The price of creating these things is almost zero – but the cost to democracy is very high.”


