A new investigation has uncovered widespread terrorist propaganda and systemic bias on Arabic-language Wikipedia, raising alarm about the reliability of one of the world’s most trusted information sources.
The World Jewish Congress’s Institute for Technology and Human Rights released a report Tuesday detailing how Arabic Wikipedia consistently violates the platform’s fundamental neutrality principles, particularly in articles about the Israeli-Palestinian conflict and the Hamas attack on southern Israel on October 7, 2023.
According to the investigation, between 25 and 50 percent of citations in major Arabic Wikipedia articles come directly from Hamas, Hezbollah, and other designated terrorist organizations. This heavy reliance on extremist sources effectively amplifies radical narratives to millions of Arabic-speaking readers worldwide.
The report highlights disturbing content patterns where terrorist groups like Hamas and Palestinian Islamic Jihad are portrayed as legitimate resistance movements, while attacks targeting civilians are euphemistically labeled as “martyrdom operations.” Some articles go further by celebrating suicide bombings and civilian attacks as historical “achievements.”
“This report demonstrates that one of the world’s most trusted knowledge platforms is being systematically manipulated to promote extremist narratives,” said Yfat Barak-Cheney, executive director of WJC’s Institute for Technology and Human Rights. “When terrorist propaganda and hate-driven narratives are allowed to masquerade as neutral information, the consequences extend far beyond Wikipedia itself.”
The findings come amid growing concerns about Wikipedia’s content integrity. Just last month, the platform faced criticism after reports that a human rights group allegedly linked to Hamas had begun training Palestinians to edit pages related to Israel and the Gaza war, potentially increasing the spread of anti-Israel propaganda.
The implications extend beyond Wikipedia itself. As artificial intelligence systems increasingly rely on Wikipedia for training data, biased or extremist content risks being amplified across multiple platforms. The WJC report recommends that technology companies and search engines implement safeguards when using Wikipedia content until meaningful reforms are established.
This is not the first time Wikipedia’s content practices have drawn official scrutiny. Last year, the U.S. House Committee on Oversight and Government Reform opened an investigation into the Wikimedia Foundation over concerns that foreign actors were exploiting the platform to insert anti-Israel or antisemitic framing.
Months earlier, the U.S. Justice Department warned the Foundation that its nonprofit status could be jeopardized for potentially violating its “legal obligations and fiduciary responsibilities” under U.S. law, specifically citing concerns about propaganda and foreign manipulation of information.
The World Jewish Congress is calling for immediate action from the Wikimedia Foundation to restore neutrality on the Arabic-language platform. Their recommendations include enforcing existing neutrality standards, removing administrators who enable extremist content, and implementing centralized monitoring for terrorism-related material.
“Wikipedia has long presented itself as humanity’s shared knowledge repository,” Barak-Cheney noted in Tuesday’s statement. “Ensuring that this knowledge remains factual is particularly critical as emerging AI platforms increasingly rely on multilingual information sources to formulate responses to user queries.”
The findings underscore the growing challenge of maintaining factual integrity in the digital age, where the line between information and propaganda can easily blur. For Wikipedia, a primary source of information for millions worldwide, addressing these concerns is increasingly urgent: its content not only shapes public opinion but also influences the next generation of AI-powered information systems.
The Wikimedia Foundation, which operates Wikipedia, now faces mounting pressure to address these serious allegations and implement more rigorous oversight of content across all language versions of its platform, particularly in regions with complex geopolitical conflicts.