Military Families Struggle with Misinformation Wave as Middle East Tensions Rise
Military families across the United States are facing unprecedented levels of anxiety and confusion amid a flood of misinformation and AI-generated content following recent U.S. operations against Iran, according to multiple military family advocacy groups.
“We’re in a totally different environment with AI and the reality around social media,” said Shannon Razsadin, CEO of the Military Family Advisory Network. “We’re seeing a lot of anxiety among military families, and the misinformation certainly does not help with that. Right now, people are really looking for information that they can count on.”
Since February 28, when the U.S. began military operations against Iran—which subsequently triggered retaliatory missile and drone attacks—military families have been bombarded with false reports and digital misinformation. These include fabricated videos claiming to show a fighter jet shot down over Basra, Iraq; Russian and Iranian media reports of a U.S. missile striking a Bahrain neighborhood; and Iranian officials falsely claiming to have captured American troops.
U.S. Central Command has been actively combating these falsehoods, recently posting a “fact check” video montage that highlighted various claims from unofficial accounts as evidence of the Iranian regime “constantly peddling lies.” However, official denials often come hours or even days after false information has already spread widely across social media platforms.
Corie Weathers, an Army spouse, described the emotional whiplash families experience when consuming both legitimate news and AI-generated content. “On one hand, real news clips and ‘cinematic’ AI-generated videos both show the horrors of ongoing combat, while social media posts from the military depict the excellence of the community,” she explained. “It can just be very, very confusing.”
What families truly need, according to Weathers, is basic information about their service members’ safety and deployment status. “Where do we get that information in order to feel secure, in order to feel informed, and also to feel like we can give that buy-in from that family readiness perspective?” she asked.
Daniella Horne, digital organizer for the Secure Families Initiative and an Army veteran who deployed twice to Afghanistan, pointed to the combination of social media content, news reports, and uncertainty from U.S. officials about the conflict’s trajectory as major stressors for military families.
“People are trying to stay informed and at the same time, support their service members, support the military,” Horne said. “Then you’re watching the news and trying to understand what the next weeks are gonna look like, what the next months are gonna look like, and if this is gonna drag on. Is this going to be another forever war?”
The rise of AI-generated content has further complicated matters. Common examples include synthetic videos showing crying soldiers in front of rubble or coffins, with some depicting service members holding photos of “dead” friends—none of which are real people or events.
Ellen Gustafson, executive director of We the Veterans & Military Families, expressed concern about AI-generated content designed to sow division, often exploiting the U.S.-Israel relationship or domestic politics. Some posts use inflammatory imagery and messaging to provoke emotional responses, such as contrasting Netanyahu’s son “basking in sunshine” with American soldiers in flag-draped coffins.
Gustafson, who founded The Homefront Sentinel to raise awareness about foreign entities and online scams targeting the military community, has tracked instances where images of fallen soldiers are repurposed for manipulation. She’s observed fake GoFundMe accounts and even AI-generated “photos” purporting to show statues of dead Americans erected in Tehran.
“A lot of AI-generated content is created to tug at heartstrings by using the fact that Americans do get emotional about our military,” Gustafson noted. “Making people more emotional about the conflict that they’re currently serving in is incredibly damaging to troop morale [and can create] divisions within units.”
Ken Ramos, a retired psychological operations soldier, explained that evoking intense emotional reactions is the “bread and butter” of psychological operations. Foreign actors exploit vulnerabilities with messaging like: “Look, you’re going to go die for these bullshit wars. This is not what I signed up for.”
The FBI and Department of Homeland Security have previously tracked tactics used by foreign actors, particularly Iran, to influence U.S. politics through AI-generated content, fake personas, and inauthentic news sites. A 2024 estimate found that 71% of images shared across social media were AI-generated, with a Europol report predicting that nearly 90% of all online content would be AI-generated by 2026.
As these digital threats evolve, military family advocacy groups continue working to provide reliable information channels and support for those caught in this complex information environment.