AI Disinformation Campaign Targets Assam Elections, Report Finds
A newly released report has identified what researchers call “the first industrialised artificial intelligence disinformation operation” in an Indian state election, occurring during the Assam Assembly election campaign.
The Foundation Diaspora in Action for Human Rights and Democracy, which conducted the study, claimed the disinformation effort formed part of a broader strategy in which Muslims were “simultaneously dehumanised, disenfranchised, displaced and erased from cultural memory.” The findings were published Tuesday, just two days before voters head to the polls in Assam, with results scheduled for announcement on May 4.
According to the report, researchers uncovered 31 confirmed deepfakes targeting Assam Congress chief Gaurav Gogoi. The manipulated media falsely portrayed him as “a Pakistani agent and Muslim sympathiser.” The study also documented 119 breaches of the model code of conduct, noting that the Election Commission allegedly failed to act in any of these cases.
Social media platforms were similarly criticized for neither removing the problematic content nor applying labels to indicate AI-generated material, despite their stated policies regarding synthetic media.
Researchers expressed concern that the tactics deployed in Assam—including voter roll purges, demographic engineering, and AI-generated communal content—are being replicated elsewhere in India, particularly in poll-bound West Bengal. “Assam is the laboratory; the rest of India is the intended market,” the report warned.
This allegation appears to align with separate claims from West Bengal Chief Minister Mamata Banerjee, who has stated that specific communities—including Matuas, Rajbanshis and minorities—were being targeted for removal from voter rolls through special intensive revision exercises.
Sophisticated Disinformation Architecture
The report details an extensive “disinformation architecture” established before the elections, comprising synthetic images, deepfake videos, and AI-generated content designed to inflame communal tensions. Researchers identified 432 posts on Facebook and Instagram they classified as “very likely” or “likely” to have been AI-generated, which collectively garnered 45.4 million views and over 100,000 likes.
A single Instagram account named “politooons” was responsible for 88% of these views, generating 40.2 million views from 102 AI-generated posts, according to the findings.
In a particularly troubling example, the report cited a post uploaded and subsequently removed by the Assam BJP that depicted Chief Minister Himanta Biswa Sarma “symbolically firing at images of two Muslim men at point-blank range.” This post reportedly combined authentic footage of Sarma handling rifles with AI-generated images portraying Muslims as targets, and was removed following public criticism on social media.
When questioned about the video in a March 12 interview with Aaj Tak, Sarma reportedly said the content “was correct” but should have identified the men as Bangladeshis rather than Muslims. The advocacy group interpreted this statement as evidence that the shift in terminology from targeting “Miya” Muslims to Bangladeshis represented “a deliberate legal adjustment rather than a change in intent.”
Context of Communal Tension
The disinformation campaign unfolded against a backdrop of rising anti-Muslim rhetoric in Assam. In the state, “Miya” is used as a derogatory term for Muslims of Bengali origin whose ancestors migrated to the region during the colonial period. Despite this long history of settlement, these communities are frequently accused of being undocumented migrants from Bangladesh.
The report alleges that in the lead-up to the elections, verified social media accounts belonging to Chief Minister Sarma and cabinet ministers published posts calling for the exclusion and economic boycott of “Miya” Muslims. These messages were subsequently “amplified through paid media at scale,” according to the findings.
The emergence of sophisticated AI-generated disinformation in Indian state elections raises significant concerns about electoral integrity and the potential for technology to exacerbate existing social divisions. As AI tools become more accessible and their outputs more convincing, the challenge of maintaining democratic processes free from manufactured manipulation grows increasingly complex.