Social media giants X, TikTok, and Meta are set to testify before the Foreign Affairs Committee as part of an ongoing inquiry into disinformation diplomacy, marking a significant step in parliamentary scrutiny of online platforms.
The high-profile hearing, scheduled to begin at 2 p.m., will feature senior representatives from three of the world’s most influential social media companies. Ali Law, Director of Public Policy for Northern Europe at TikTok; David Agranovich, Director of Global Threat Disruption at Meta; and Wifredo Fernández, Head of Americas for Global Government Affairs at X Corp (formerly Twitter), will face questioning from committee members.
The committee’s investigation aims to examine the dual role these platforms play in both enabling and combating foreign disinformation campaigns. Lawmakers are expected to press executives on their companies’ policies, enforcement mechanisms, and effectiveness in identifying and neutralizing coordinated disinformation efforts by foreign actors.
The hearing comes at a critical juncture, as concerns about digital manipulation of public opinion have reached unprecedented levels. Recent years have seen mounting evidence of state-sponsored disinformation campaigns targeting democratic processes worldwide, including elections, public health initiatives, and geopolitical conflicts.
TikTok, owned by Chinese company ByteDance, has faced particular scrutiny in Western nations over potential data security concerns and questions about content moderation practices. The platform’s representative, Ali Law, will likely face questions about the company’s independence from Chinese government influence and its approach to moderating political content.
Meta, parent company of Facebook and Instagram, has implemented extensive systems to combat disinformation following criticism of its role during the 2016 U.S. presidential election. David Agranovich, who leads the company’s threat disruption efforts, is positioned to address how Meta’s policies have evolved since then and what new challenges have emerged in the shifting landscape of online manipulation.
X Corp, which has undergone significant changes since Elon Musk’s acquisition, will be represented by Wifredo Fernández. The platform has faced questions about its content moderation approach and staffing of trust and safety teams following extensive restructuring under new ownership.
A central focus of the inquiry is expected to be the transparency and accountability of the platforms. Legislators have repeatedly expressed frustration with what they see as insufficient disclosure about how algorithms promote content, how moderation decisions are made, and what metrics the companies use to measure success in fighting disinformation.
The hearing reflects growing global regulatory pressure on social media companies. The European Union’s Digital Services Act has established stringent requirements for platform transparency and accountability, while the UK’s Online Safety Act has introduced new obligations for platforms to protect users from harmful content.
Industry analysts suggest this scrutiny comes amid a broader reassessment of social media’s societal impact. “These platforms have unprecedented influence over public discourse,” noted Dr. Eleanor Marsh, digital policy researcher at King’s College London. “Governments worldwide are recognizing the need for more robust oversight mechanisms that balance free expression with protection against coordinated manipulation.”
The testimony from these three tech executives will provide valuable insights into how major platforms view their responsibilities in the information ecosystem and what measures they are taking to address sophisticated disinformation threats.
The Foreign Affairs Committee’s inquiry into disinformation diplomacy is part of a wider governmental effort to understand and address the challenges posed by digital manipulation in international relations. The findings could potentially inform future legislation or regulatory approaches to platform governance.
11 Comments
Tackling foreign disinformation is a top priority, but these platforms also need to address domestic sources of misinformation. Careful balance required between free speech and content moderation.
Good point. Domestic misinformation can be just as damaging as foreign-backed efforts. Lawmakers will likely push for more transparency around the companies’ policies and enforcement.
Disinformation is a global problem that requires global solutions. Glad to see these tech giants being held accountable, but the broader societal challenges around truth and trust online are daunting.
Testimony from X, Meta and TikTok execs could provide valuable insights into the scale and nature of foreign disinformation efforts targeting their platforms. Curious to hear their perspectives on effective countermeasures.
Yes, understanding the specific tactics and techniques used by foreign actors will be crucial. Hopefully the hearing uncovers new details that can inform policy responses.
Interesting to see these major social media platforms testifying on disinformation concerns. It’s a complex issue with no easy solutions, but transparency and accountability from these companies is important.
Agreed. Disinformation is a serious challenge that requires a multi-faceted approach. Curious to hear the executives’ perspectives on the policies and tools they have in place.
Disinformation is a complex challenge without simple solutions. Glad to see lawmakers taking a closer look at the role of social media companies in this issue. Cautiously optimistic this hearing can lead to progress.
It’s encouraging to see these tech giants being held accountable. Disinformation is a global issue that requires coordinated efforts across industry and government. Hopeful this hearing leads to meaningful change.
Curious to see how X, Meta and TikTok respond to the committee’s concerns. Responsible content moderation is crucial, but the line between that and censorship can be blurry. Tough balance to strike.
Absolutely. These platforms wield immense influence, so getting the policies right is critical. Transparency around their algorithms and decision-making processes will be key.