The Trump administration has once again extended the deadline for enforcing the TikTok ban-or-divest law, potentially marking the final extension before the measure takes effect. The law, originally passed during the Biden administration, requires TikTok's Chinese parent company, ByteDance, to divest the app's U.S. operations to American owners or face a ban in the United States.
This development raises two significant questions: whether longstanding concerns that the app functions as a tool of Chinese influence are valid, and whether American ownership would truly resolve them. The situation holds particular relevance for Canada, where similar anxieties about foreign manipulation through social media platforms have led to TikTok bans on government devices, and where legislation such as Bill C-18 aims to protect domestic news sources.
Canada’s information landscape has historically been shaped by developments in the United States, a dependency that grows increasingly precarious as American politics takes a more adversarial stance toward its northern neighbor.
While TikTok has been singled out, security experts note that all major digital platforms introduce similar vulnerabilities. If the objective is to strengthen democratic security, focusing exclusively on TikTok may be too narrow, and an ownership transfer alone may accomplish little, especially since reports indicate TikTok's Chinese parent, ByteDance, would retain control of the app's algorithm despite the divestment.
Concerns about TikTok primarily revolve around two major issues. The first is data security: fears that the app functions as a surveillance tool feeding information to the Chinese government, raising both national security concerns about critical infrastructure and personal privacy risks. Many countries have responded by banning the app on government devices, on the theory that keeping data within national borders may address this vulnerability.
The second, more politically charged fear is that TikTok operates as an influence machine. Critics worry its algorithm can be manipulated to spread propaganda, sway public opinion, censor certain viewpoints, or even interfere in elections.
These concerns intensified in 2023 when Osama bin Laden’s “Letter to America” went viral on TikTok, prompting lawmakers to cite the incident as evidence that the platform could amplify extremist content. Investigations have also found that topics sensitive to China, such as Tiananmen Square and Tibet, are notably harder to find on TikTok than on other platforms.
The fundamental issue extends beyond TikTok to all social media platforms: users have minimal insight into how their content feeds function, what factors shape them, or how those feeds might be influencing behavior. This opacity breeds rational mistrust, fostering not only fear of being manipulated but also a tendency to view political opponents as victims of manipulation.
Political rhetoric surrounding TikTok reflects these concerns. U.S. Vice President JD Vance has stated that the executive order would prevent the algorithm from being used as a “propaganda tool by a foreign government,” suggesting American business leaders would ensure fair operation. Meanwhile, President Donald Trump remarked he would make TikTok “100 percent MAGA,” while adding that “everyone’s going to be treated fairly.”
These statements inadvertently highlight a troubling implication: divestment doesn’t eliminate manipulation risks – it simply changes who controls the levers of influence. Actions framed as resisting foreign propaganda simultaneously normalize domestic manipulation as standard political practice.
Critics argue the United States has missed an opportunity to demonstrate soft power by creating transparent, trustworthy information systems that others would want to emulate. Instead, leaders have prioritized seizing a temporary advantage at the expense of establishing a more enduring source of legitimacy.
The greater concern is the increasing normalization of social media as a weapon of influence, despite widespread knowledge that these platforms can be manipulated. Society’s growing dependence on these platforms as news sources makes their vulnerabilities increasingly dangerous, even for those who don’t directly use them.
Canada experienced its own digital media challenges with the Online News Act (Bill C-18), which required large platforms to compensate news outlets for news content shared on their services. When Meta responded by blocking news on Facebook and Instagram in Canada, engagement with Canadian news content dropped 85 percent, potentially weakening rather than strengthening Canadian journalism.
Addressing these challenges requires focusing on the fundamental power imbalance between platforms and users. This power includes surveillance capabilities that predict user preferences, algorithmic control over information exposure, and the ability to set platform rules that determine who gains influence.
Rather than negotiating with this power, as both Canada and the United States have attempted, experts suggest reducing vulnerabilities in our relationship with media platforms and supporting domestic journalism that can compete effectively. The ultimate goal should be making platforms less susceptible to manipulation by any actor, foreign or domestic.
11 Comments
The TikTok case raises important questions about the role of technology in shaping our information landscape. While profit distribution is relevant, I’m more concerned about the deeper implications for democratic integrity. Focusing solely on commercial interests seems short-sighted.
This is a challenging issue without easy answers. On one hand, the potential for foreign manipulation through social media platforms is a serious threat that needs to be addressed. On the other, any solution needs to balance security concerns with commercial realities. It’s a delicate balance.
The TikTok situation highlights the delicate balance between commercial interests and national security concerns. I appreciate the nuance in this analysis – it’s not a simple black-and-white issue. We need to carefully weigh the tradeoffs and potential unintended consequences.
Absolutely. Addressing manipulation risks should be the top priority, even if it means less favorable commercial terms. The long-term implications for our information ecosystem are far more important.
The TikTok situation highlights the complex interplay between technology, business, and national security. While the profit-sharing aspect is important, I agree that the manipulation and influence concerns should be the top priority. Curious to see how this plays out in the coming months.
This is a fascinating case that speaks to the broader challenges of regulating global tech platforms. I appreciate the nuanced analysis that acknowledges the complexities involved. Addressing manipulation risks should be the top priority, even if it means less favorable commercial terms.
This TikTok saga seems like a complex geopolitical chess match. Profit distribution is important, but the deeper concerns around manipulation and foreign influence shouldn’t be ignored. I’m curious to see how this unfolds and whether the American ownership will truly address those underlying issues.
Agreed. While the profit-sharing aspect is certainly relevant, the broader security implications are what really matter here. Ensuring the integrity of our information landscape is critical for democratic societies.
The TikTok saga highlights the tensions between economic interests and national security concerns. While the profit-sharing aspect is not irrelevant, I’m more worried about the potential for foreign influence and the integrity of our information ecosystem. This is a complex issue that deserves careful consideration.
This is a fascinating case study in the geopolitics of technology and social media. I’m skeptical that a change in ownership alone will resolve the deeper concerns around foreign influence and data security. The devil is likely in the details of any potential deal.
Well said. Focusing solely on profit distribution feels like a missed opportunity to truly address the core issues. We need a more holistic approach that prioritizes democratic integrity over commercial interests.