Google Raises Freedom of Expression Concerns Over UK Online Safety Proposals

Google has expressed concerns about potential threats to freedom of expression in response to proposals from Ofcom, Britain’s independent media regulator, regarding implementation of the Online Safety Act. However, claims circulating on social media that Google received specific takedown requests for “hate speech” content from Ofcom appear to be unfounded.

Posts shared widely on Facebook, Instagram, X, and TikTok in late December alleged that “Google has exposed that they received a series of requests from the OFCOM regulator to censor content that they considered ‘hate speech’.” Some posts went further, claiming Google had “declared The Labour Party is a threat to free speech.”

These claims gained traction after the UK edition of the International Business Times published an article quoting Google as warning that Britain risked “authoritarian irrelevance” in its approach to online safety legislation. However, Google has confirmed to Reuters that it never used this phrase in any of its submissions to UK authorities.

The Online Safety Act, passed by the UK Parliament in 2023, establishes strict requirements for online platforms to address criminal activity, illegal content, and child safety issues. Critics of the legislation have argued that the rules could be applied too broadly, potentially leading to censorship of legal content.

When contacted by Reuters, Google explicitly stated it was not aware of any requests from Ofcom to remove specific content. An Ofcom spokesperson clarified that the regulator’s role is to ensure websites and apps have proper systems in place to comply with the Online Safety Act, not to direct platforms to remove specific posts or accounts.

Google’s actual submission to Ofcom’s consultation, published online on December 11, expressed concerns about a specific proposal that would hide potentially illegal content from users until platforms could review it. Google argued this measure appeared to create a “general monitoring obligation” that could infringe on freedom of expression, as content initially flagged as “potentially illegal” might ultimately be determined to be legal.

“We will consider all responses to our consultation carefully before making our final decisions next year,” an Ofcom spokesperson told Reuters. “There is nothing in our proposals that would require sites and apps to take down legal content. In fact, in carrying out their duties to keep people safe, the Online Safety Act requires platforms to have particular regard to the importance of protecting users’ right to freedom of expression.”

The spokesperson added that the proposals aim to prevent the rapid spread of potentially harmful illegal content, particularly during crisis situations: “If illegal content spreads rapidly online, it can lead to severe and widespread harm, especially during a crisis. Recommender systems can exacerbate this.”

The consultation process represents a critical phase in determining how the Online Safety Act will be implemented in practice. Tech companies, civil liberties advocates, and government officials continue to debate the appropriate balance between online safety and freedom of expression.

Britain’s Department for Science, Innovation and Technology declined to provide on-the-record comments when approached by Reuters regarding this matter. The International Business Times did not respond to requests for clarification about its reporting.

The misinformation surrounding Google’s position highlights the contentious nature of online content regulation and the challenges governments face in crafting policies that both protect users and preserve freedom of expression in the digital age.

