Social media giants faced harsh criticism from MPs during a confrontational parliamentary hearing, where lawmakers accused platforms of failing to address misinformation, deepfakes, and harmful content despite their claims of robust safety measures.

The Commons science, innovation and technology committee hearing highlighted the growing frustration among British legislators with tech companies’ inability to effectively combat online harms. Representatives from X, TikTok, and Meta presented their safety protocols but were met with skepticism and evidence contradicting their claims.

In one particularly pointed exchange, TikTok’s director of public policy for northern Europe, Alistair Law, insisted the platform prohibits pornography, nudity, and harassment. MP Freddie van Mierlo immediately countered this assertion, revealing he had discovered “numerous examples this morning” of TikTok videos providing instructions on using Elon Musk’s Grok AI to create nude images of young girls.

The committee also challenged X’s claimed political neutrality. Wifredo Fernández, X’s director of global government affairs, maintained the platform was “politically agnostic” despite MP Emily Darlington citing research showing it promotes right-wing content. She also referenced Musk’s recent endorsement of the far-right UK political party Restore as “the only way to save Britain.” Fernández attempted to separate Musk’s personal views from the platform’s position, stating, “Mr Musk posts and participates in the public conversation individually… We don’t have a political perspective as a platform”—a claim committee chair Dame Chi Onwurah said “many might dispute.”

Former Conservative minister George Freeman MP shared his personal experience with harmful deepfakes, describing how a fabricated video falsely showing him defecting to Reform UK circulated on X, Facebook, and YouTube last September. When asked if X had taken any action, Fernández admitted he would “have to check with the teams,” to which Freeman responded, “The answer’s no.”

Freeman expressed concern about “the complacency of the platforms,” warning that the upcoming May elections “could be seriously disrupted” by similar deepfakes and misinformation campaigns.

The hearing comes amid a public consultation on potential legal changes regarding children’s access to social media. Options under consideration include age restrictions, curfews, and time limits. Freeman also proposed making the misappropriation of a person’s identity illegal, to protect citizens from waking up to find “a deeply damaging, disruptive and dangerous misrepresentation” of themselves online.

Meta faced particular scrutiny when Dr. Lauren Sullivan MP referenced a recent National Education Union experiment where accounts created for 13-year-olds were quickly exposed to “violent and misogynistic self-harm, extremist content.” She described the material as “appalling” and too graphic to show during the hearing. Rebecca Stimson, Meta’s UK public policy director, responded by promising to “look at it very closely and take that very seriously.”

MP Martin Wrigley accused the tech executives of complacency, stating they began the hearing claiming “everything’s fine” while the committee had “demonstrated a number of different occasions when things are not fine” on their platforms.

Chair Dame Chi Onwurah summarized recent failures, citing misinformation about the Bondi Beach victim, election interference, fake photos of burning US aircraft carriers as part of Iranian misinformation, and fabricated evidence regarding a missile attack on a school in Iran.

“The basic fact is that all the work that you tell us that you are doing on online harms and to make your platforms safe in this country is not working,” Onwurah concluded, suggesting this view reflects “the consensus of most of the British people.”

She delivered a clear ultimatum to the tech companies: demonstrate meaningful progress in making their products safe for British citizens within months or face “further legislation to make it safe, because the first duty of any government is to protect its citizens.”




© 2026 Disinformation Commission LLC. All rights reserved.