
Digital platforms across the UK are facing an increasingly complex battle against misinformation, but current regulatory frameworks are falling short, according to expert testimony recently submitted to Parliament.

Dr. Elena Abrusci, Senior Lecturer in Law at Brunel University London, delivered a sobering assessment to the Science, Innovation and Technology Committee’s inquiry on “Social Media, Misinformation and Harmful Algorithms” in December. Her analysis highlights critical gaps in the UK’s approach to combating harmful content online.

“Misinformation and disinformation have long existed in society, yet policy responses remain limited and largely ineffective,” Dr. Abrusci stated in her written evidence. The testimony comes at a crucial time when digital platforms are increasingly scrutinized for their role in spreading false information.

According to the submission, conventional approaches to tackling misinformation—including content moderation, media literacy programs, and regulatory frameworks—have achieved minimal success in reducing societal harm. This raises significant concerns about the effectiveness of current strategies as misinformation continues to proliferate across social platforms.

The emergence of generative artificial intelligence has further complicated the landscape. While AI has substantially increased the volume of misinformation circulating online, Dr. Abrusci argues that it hasn’t fundamentally altered the nature of the problem or the societal damage caused by false information. Rather, it has amplified existing challenges and accelerated the spread of misleading content.

A central focus of the testimony was the UK’s Online Safety Act, which Dr. Abrusci criticizes on several fronts. The legislation struggles to strike an appropriate balance between protecting freedom of expression and preventing harm; its definitions remain vague, inviting inconsistent application; and it fails to give regulators sufficient enforcement powers to address violations effectively.

The regulation of public debate presents a particular challenge, requiring careful consideration of competing rights. While free expression must be protected, the rights of individuals who may be harmed by certain content—either directly or indirectly—must also be safeguarded. This delicate balance is not adequately addressed in current frameworks, according to the testimony.

Emerging technologies pose additional regulatory challenges that current legislation fails to address. The Online Safety Act does not sufficiently tackle the growing threat of deepfakes—highly convincing but fabricated media content. The lack of clear guidelines for service providers creates ambiguity in enforcement, potentially leading to either ineffective moderation or overreach that could suppress legitimate speech.

Dr. Abrusci emphasized that addressing misinformation effectively requires coordination beyond a single regulatory body. While Ofcom holds primary responsibility under the current framework, truly effective oversight would necessitate collaboration with multiple authorities, including the Electoral Commission, Advertising Standards Authority, and Equality and Human Rights Commission.

The testimony comes amid growing global concern about the impact of misinformation on democratic processes, public health, and social cohesion. In recent years, false information spread through social media has been linked to vaccine hesitancy, election interference, and increased polarization across society.

Industry experts have long argued that technology companies must take greater responsibility for content on their platforms, while civil liberties advocates caution against excessive restrictions on speech. Finding the right balance remains elusive, as evidenced by Dr. Abrusci’s critique of current approaches.

The Science, Innovation and Technology Committee’s inquiry continues to gather evidence as policymakers grapple with developing more effective frameworks to address online harms while preserving the benefits of digital communication and open discourse.

As social media platforms continue to evolve and new technologies emerge, the need for adaptive, coherent, and effective regulatory approaches becomes increasingly urgent—a challenge that, according to Dr. Abrusci’s testimony, remains substantially unmet by current UK legislation.


8 Comments

  1. The proliferation of misinformation on social media is a worrying trend. Strengthening media literacy and improving content moderation will be important first steps, but more comprehensive solutions are clearly needed.

    • Mary Rodriguez

      Absolutely. Innovative thinking and cross-sector collaboration will be key to finding effective long-term solutions to this problem.

  2. Jennifer Lopez

    Misinformation can have far-reaching societal consequences. I’m curious to learn more about the specific gaps in the UK’s current approach and what alternative policy frameworks might be more effective.

    • Mary Rodriguez

      Good point. The expert’s analysis suggests that conventional methods have had limited success — understanding those shortcomings could help inform more innovative solutions.

  3. This expert testimony underscores the urgent need to address the harmful impacts of misinformation and problematic algorithmic design on digital platforms. Tackling this issue requires a multi-pronged approach.

  4. This expert testimony highlights the significant challenges in combating misinformation on digital platforms. Nuanced, multifaceted approaches will be crucial to address this complex issue effectively.

    • I agree, the current regulatory frameworks seem inadequate. Policymakers must work closely with experts to develop more robust and adaptive strategies.

  5. This is an important issue that deserves serious attention. Developing robust yet flexible strategies to combat misinformation across digital platforms will be crucial going forward.



© 2026 Disinformation Commission LLC. All rights reserved.