Experts are warning that China and Russia have refined their disinformation tactics, now using sophisticated methods that blend truths with falsehoods to manipulate online discourse and public opinion. According to Japanese analyst Masakazu Takamori, this evolution represents a significant challenge for democratic nations.

Speaking at a recent seminar in Taiwan, Takamori, who serves as CEO of Japan Nexus Intelligence, explained that foreign operatives have moved beyond completely fabricated narratives to more nuanced approaches. These include deploying generative artificial intelligence to create content that avoids the telltale linguistic errors often made by non-native speakers, effectively concealing the foreign origin of propaganda.

“We’ve detected evidence of ‘bot bombs’ in Japan,” Takamori noted, describing algorithms designed to identify specific keywords and generate massive amounts of content around targeted narratives. These operations artificially boost engagement rates to ensure controversial topics quickly rise to the top of social media feeds.

The cognitive warfare directed at Japan concentrates on several strategic themes. One promotes the false claim that Okinawa is rightfully Chinese territory that Japan forcibly annexed. Another targets U.S. military installations in Japan by amplifying grievances about the “unequal burden” placed on local residents, aiming to foster anti-American sentiment and discontent toward the central government in Tokyo.

Takamori presented evidence showing how foreign operatives consistently portray Japan and Taiwan as militarily inferior to China. One propaganda image he shared depicted China as a giant panda pursuing Japan and Taiwan, portrayed as small, helpless rats—a visual metaphor designed to instill a sense of inevitable defeat.

More recently, these disinformation campaigns have begun targeting Japanese Prime Minister Sanae Takaichi personally, labeling her as an extreme right-winger and suggesting that Japanese businesses associated with her administration could face retaliation from the Chinese government. “These tactics are clearly intended to undermine the Takaichi administration’s national security and defense positions,” Takamori observed.

The expert identified additional propaganda themes seeking to exploit domestic social tensions, including controversies around gender equality and resource distribution. These campaigns attempt to fuel youth discontent against the established order. Foreign actors also commonly accuse Tokyo of ineffective responses to natural disasters and inadequate victim relief, thereby provoking public anger toward government institutions.

“The pattern of cognitive warfare attacks targeting Taiwan is almost identical to what we’re seeing in Japan, especially during election periods,” Takamori said, noting that the primary difference is simply the language used. “Hostile forces are exploiting freedom of speech in our democracies to spread their lies.”

Both Japan and Taiwan face constraints in countering disinformation through direct or forceful means. Current strategies rely heavily on commentators, influencers, and academics to rebut propaganda. However, Takamori argues this reactive “fire brigade” approach is unsustainable, as it places governments perpetually on the defensive.

“Defending against cognitive warfare requires governments to anticipate foreign propaganda and implement pre-emptive measures to strengthen the public’s psychological resilience,” he emphasized. This necessitates stronger international cooperation in sharing data about hostile disinformation campaigns.

Takamori specifically called for enhanced collaboration between Taiwan and Japan, suggesting that pooling resources and intelligence could help both democracies better withstand information warfare from authoritarian regimes.

As cognitive warfare tactics grow increasingly sophisticated with advances in artificial intelligence and social media algorithms, democratic nations face mounting pressure to develop effective countermeasures that preserve freedom of expression while protecting against foreign manipulation of public discourse.

16 Comments

  1. Emma Thomas

    The ‘bot bombs’ described in the report sound like a particularly pernicious tactic. I’m curious to learn more about the specific algorithms and techniques used to artificially boost engagement and visibility of targeted narratives.

    • Ava Hernandez

      That’s a good point. Understanding the technical details of these operations will be important for developing effective countermeasures.

  2. Elijah Martinez

    It’s troubling to see how foreign actors are leveraging sophisticated techniques to manipulate online discourse. We must remain vigilant and work to educate the public on identifying and resisting these deceptive tactics.

    • Elijah Lee

      I agree. Strengthening media literacy and critical thinking skills will be crucial in helping the public navigate the information landscape and resist manipulation.

  3. John Lopez

    This is a concerning development. The use of sophisticated AI and ‘bot bombs’ to manipulate online discourse is a serious threat to democracy. We need robust strategies to identify and counter these deceptive tactics.

    • Agreed. Strengthening digital media literacy and fact-checking capabilities will be crucial to combat these foreign disinformation campaigns.

  4. Robert Jones

    As someone who follows the mining and energy sectors, I’m concerned about the potential impact of these foreign influence operations on public perception and policy decisions related to critical minerals and resources.

    • James Miller

      That’s a good point. We’ll need to be vigilant in verifying information and fact-checking claims related to these industries to maintain public trust.

  5. Linda Johnson

    It’s alarming to see how foreign actors are evolving their tactics to create more convincing propaganda. We must stay vigilant and invest in tools to detect and debunk these manipulative narratives.

    • Patricia Lopez

      Absolutely. Increased transparency and public awareness are key to building resilience against these information warfare tactics.

  6. John Johnson

    The use of AI-generated content to conceal the foreign origin of propaganda is a concerning development. We’ll need to invest in advanced detection and analysis capabilities to stay ahead of these evolving tactics.

    • Ava Taylor

      Definitely. Collaboration between governments, tech companies, and the research community will be key to effectively countering these threats.

  7. Jennifer Thompson

    Targeting Okinawa with false claims is a particularly concerning tactic, as it could stir up tensions and instability in the region. We need to closely monitor these developments and work with our allies to address the threat.

    • James Lopez

      Yes, the geopolitical implications of these disinformation campaigns are worrying. Coordinating a robust international response will be crucial.

  8. Noah Garcia

    While the use of AI-generated content is concerning, I’m curious to learn more about the specific techniques being employed and how they are able to conceal the foreign origin of these narratives.

    • Lucas V. Taylor

      That’s a good point. Understanding the technical details of these operations will be important for developing effective countermeasures.

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2026 Disinformation Commission LLC. All rights reserved.