Israel has significantly increased its propaganda budget aimed at influencing American Christians and artificial intelligence users, according to documents recently obtained by government transparency advocates.
The Israeli Ministry of Foreign Affairs has allocated approximately $9 million for a comprehensive influence campaign targeting evangelical churchgoers and users of popular AI platforms like ChatGPT and Google Bard in the United States. The initiative comes as Israel seeks to shore up international support amid ongoing conflicts in Gaza and Lebanon.
Internal communications reveal that Israeli officials have expressed concerns about waning support among traditionally pro-Israel demographics, particularly younger evangelicals. One ministry document notes that while older generations of American Christians have maintained strong support for Israel, there has been “troubling erosion” among those under 35.
“We need to reestablish our narrative before an entire generation becomes unreachable,” the document states, highlighting a strategic shift toward digital influence operations rather than traditional diplomacy.
The campaign includes the creation of specialized content designed to be distributed through church networks, Christian broadcasting channels, and religious social media communities. Israeli officials have contracted several U.S.-based public relations firms with experience in religious outreach to develop materials that frame the ongoing conflicts through biblical and theological lenses.
Particularly notable is the focus on artificial intelligence platforms. The Israeli government has established a specialized team dedicated to monitoring and influencing how AI systems like ChatGPT respond to queries about Israel, Palestine, and Middle East conflicts. This includes systematic efforts to report responses deemed unfavorable to Israel’s position and to work with OpenAI and other AI developers to “correct biased information.”
Dr. Sarah Goldstein, professor of media studies at Georgetown University, explains that this represents a new frontier in digital influence operations. “Governments are realizing that AI systems are increasingly where people turn for information. By shaping how these systems respond, you can effectively influence public perception without the user even realizing it’s happening.”
The strategy documents indicate that Israeli officials view American evangelical communities as a crucial support base due to their political influence and theological connections to Israel. The materials emphasize messaging around “shared Judeo-Christian values” and “defending the Holy Land” while downplaying humanitarian concerns related to Palestinian casualties.
For AI platforms, the approach is more technical. Israel has hired data scientists and prompt engineers to develop systematic ways of influencing AI training data and response algorithms. The documents indicate that officials are particularly concerned about how AI systems characterize military operations in Gaza and Lebanon.
Media watchdog organizations have expressed alarm at these developments. The Center for Digital Democracy called the campaign “an unprecedented attempt to manipulate both religious communities and emerging technologies to shape American public opinion.”
Israeli officials, when contacted for comment, defended the initiative as necessary public diplomacy in a challenging media environment. Foreign Ministry spokesperson Daniel Cohen stated, “Every nation has both the right and responsibility to ensure accurate information about its policies is available to the public.”
The campaign comes at a critical time for U.S.-Israel relations, with polls showing decreasing support for Israeli military actions among younger Americans across religious affiliations. The Biden administration has faced growing pressure from progressive Democrats to condition military aid on Israeli policy changes, while Republican lawmakers have largely maintained strong pro-Israel positions.
Experts note that Israel’s focus on evangelical communities is strategically sound given their electoral importance in American politics, particularly in swing states. According to recent Pew Research data, evangelical voters represent approximately 25% of the U.S. electorate and have historically shown higher rates of support for pro-Israel policies than other demographic groups.
The revelation of this expanded influence campaign highlights the evolving nature of international public relations in the digital age, where traditional diplomacy increasingly intersects with sophisticated information operations targeting both human and artificial intelligence systems.