Russian Disinformation and AI Target German Voters Ahead of Critical Election

German voters face an unprecedented wave of digital manipulation as Sunday’s Bundestag election approaches, with experts identifying sophisticated Russian disinformation campaigns and AI-generated content promoting far-right narratives.

Security researchers have linked two Russian groups—“Doppelgänger” and “Storm-1516”—to coordinated efforts targeting German political discourse. These same organizations were active during the U.S. election last year, according to American intelligence officials.

“It’s not about just one fake video or one fake article. There’s a systematic effort to constantly create this flood of false stories and propaganda,” says Julia Smirnova, a senior researcher at the Center for Monitoring, Analysis and Strategy (CeMAS), a non-profit specializing in disinformation analysis.

The campaigns have deployed increasingly sophisticated tactics, including deepfake videos featuring fabricated “witnesses” or “whistleblowers” making false accusations against mainstream politicians. In one notable case from November 2024, an AI-generated video falsely claimed that Dr. Marcus Faber, chair of the Bundestag’s defense committee and a vocal Ukraine supporter, was a Russian spy. In another instance, an AI-created video showed an 18-year-old woman making false child abuse allegations against a German minister.

Researchers have traced these disinformation efforts to the Russian PR firm Social Design Agency, widely reported to have Kremlin connections. The group’s primary strategy involves creating counterfeit news articles mimicking established German publications, then distributing them through networks of social media accounts.

From mid-December 2024 to mid-January 2025 alone, CeMAS identified 630 German-language posts with “typical Doppelgänger patterns” on X (formerly Twitter). Many posts pose as concerned citizens questioning Germany’s support for Ukraine or highlighting domestic economic challenges.

The far-right Alternative for Germany (AfD) party, currently polling in second place nationally, has been particularly active in the social media sphere, outpacing other parties in online engagement. Cybersecurity policy adviser Ferdinand Gehringer notes this alignment is no coincidence: “Russia sees within the AfD’s program and ideas the best options for future cooperation,” he explains, citing the party’s opposition to arming Ukraine and its support for Russian gas imports.

CeMAS has documented at least one case where an AfD politician, parliamentary member Stephan Protschka, amplified a narrative originating from Russian disinformation campaigns. Protschka shared posts claiming the Green Party was collaborating with Ukraine to stage crimes that would be blamed on the AfD.

Beyond foreign influence operations, domestic far-right groups are leveraging AI to shape public opinion. One striking example is “Larissa Wagner,” an entirely AI-generated influencer who promotes far-right viewpoints on social media. With accounts created in the past year, the virtual persona regularly posts videos advocating for the AfD and expressing anti-immigrant sentiments.

When contacted by reporters, the account behind “Wagner” responded: “I think it’s completely irrelevant who controls me. Influencers like me are the future… Like anyone else, I want to share my perspective on things.”

The Institute for Strategic Dialogue recently identified 883 posts since April 2023 containing AI-generated content from far-right supporters and official AfD accounts. In October alone, AfD party accounts published more than 50 posts featuring AI-generated content—significantly more than other political parties.

“They’re clearly the one actor that is exploiting this technology the most,” says Pablo Maristany de las Casas, an analyst who co-authored the report. The content typically follows two narrative tracks: attacking migrants by portraying them as criminals and glorifying traditional German values.

A recent survey by the Bertelsmann Foundation found that 80% of Germans consider online disinformation a major societal problem, and 88% believe it is spread to influence political opinions. Cathleen Berger, a senior researcher at the foundation, notes that foreign disinformation becomes truly impactful “when it is being picked up by domestic actors.”

As Germans prepare to elect a new parliament on Sunday, these coordinated manipulation efforts highlight the evolving challenges to democratic discourse in the digital age, where distinguishing genuine political expression from artificial manipulation becomes increasingly difficult.


