Social Media Algorithms Shaping European Political Views, Survey Reveals
Europe’s digital information landscape is increasingly dominated by platforms like TikTok, X (formerly Twitter), and YouTube—services that were never designed with democratic resilience in mind. While a brief TikTok ban in the United States earlier this year sparked debates about national security risks, more immediate concerns are emerging closer to home in Europe.
In Germany, recent research indicates recommendation algorithms are directing new users toward far-right content, while X has faced criticism after owner Elon Musk openly endorsed the far-right Alternative for Germany (AfD) party. Though the influence of social media on user perceptions has long been suspected, the specific mechanisms and extent to which these algorithms systematically shift beliefs, particularly around disinformation, remain poorly understood.
A nationwide survey conducted in Germany at the beginning of 2024, commissioned by the Friedrich Naumann Foundation for Freedom, attempts to shed light on this relationship. The findings reveal that users of social media—especially TikTok, X, and YouTube—are significantly more likely to accept false or misleading narratives.
Among German TikTok users, the survey found concerning levels of support for propaganda-friendly positions. For instance, many endorsed the view that Russia is fighting against a “fascist regime” in Ukraine, or rejected the characterization of China as a dictatorship. Perhaps most alarming, 42% of TikTok users surveyed expressed the belief that authoritarian systems like China’s are more effective than democracy.
The data reveals a pronounced generational divide in perceptions. Nearly two-thirds of respondents over 60 fully agree that China is a dictatorship, compared to just one-third of those aged 16-30. Approximately one in five young adults explicitly rejects the characterization of China as dictatorial. Similar patterns emerge regarding Russia’s war in Ukraine. While most Germans recognize it as an illegal war of aggression and support Western assistance, almost 20% believe Moscow has a greater interest in peace than Western nations—with higher percentages among those under 45.
Media consumption habits strongly correlate with these viewpoints. Only 28% of TikTok users fully agree China is a dictatorship, compared to approximately half of the overall sample. Meanwhile, consumers of traditional media sources like public broadcasters (68.9%), local newspapers (68.6%), and national newspapers (67.7%) express much higher skepticism toward Beijing.
The survey also explored perceptions about the primary sources of disinformation. While 70% of respondents identified Russia as the leading purveyor of false information, TikTok users were significantly less likely than average to view China as a disinformation source (41.4% versus 59.2%). They were also more likely to suspect Germany itself of spreading disinformation (34.3% versus 22.7%) and less likely to suspect China or Russia (50.2%).
Many Germans report feeling ill-equipped to combat disinformation. Though more than half recognize it as a serious problem, approximately half acknowledge difficulty identifying false narratives. Younger people express more confidence in their ability to spot manipulation, despite showing signs of greater exposure and vulnerability to misleading content.
The European Union’s Digital Services Act (DSA) was designed to address these challenges by increasing platform transparency and accountability. However, major platforms continue to resist providing researchers with adequate data access to study how recommendation systems rank, amplify, and shape civic information consumption. Recent pressure from the U.S. administration against European regulatory frameworks has further complicated enforcement efforts.
This struggle has significant implications. Platform business models, leadership decisions, and algorithmic incentives leave their mark on public discourse. Under Elon Musk’s ownership, X has escalated confrontations with EU regulations such as the DSA and the General Data Protection Regulation (GDPR). Meanwhile, authoritarian regimes have grown adept at manipulating engagement-driven systems to spread their preferred narratives.
When a significant portion of younger users begin viewing authoritarian governance as “more effective” than democracy, the issue transcends content moderation to become an existential challenge to democratic institutions. Europe possesses regulatory tools to address these problems but must deploy them more effectively.
Disinformation functions both as a symptom and accelerant of deeper geopolitical and social shifts. To protect the integrity of Europe’s information ecosystem, the continent cannot outsource responsibility to private companies—particularly those vulnerable to exploitation by authoritarian regimes. Building democracies that can withstand today’s challenges requires accountable platforms, enforceable regulations, robust research capabilities, adaptive interventions, and citizens committed to protecting a shared reality.
9 Comments
Interesting research on how social media shapes political views in Europe. The role of recommendation algorithms in driving users towards extremist content is concerning and deserves further scrutiny.
This is a worrying trend that deserves serious attention. The influence of social media on political views, especially among younger generations, is a threat to the integrity of democratic processes.
This survey highlights the outsized influence of social media on how people, especially youth, form their political opinions. Policymakers should investigate ways to promote digital literacy and media resilience to combat the spread of disinformation.
The findings of this survey are deeply troubling. The extent to which social media algorithms are shaping political views, particularly among young people, is a serious threat to democratic discourse.
While social media has democratized access to information, the potential for these platforms to amplify misinformation and extremist narratives is deeply concerning. Urgent action is needed to address this challenge.
While social media has democratized access to information, the algorithms driving these platforms can also amplify polarizing and misleading narratives. Greater oversight and user controls are needed to safeguard the public interest.
It’s worrying to see the potential for social media to drive users towards misinformation and extremist content. Platforms need to do more to address these challenges and protect the integrity of democratic discourse online.
This is a concerning trend that deserves urgent attention. The ability of social media platforms to amplify misinformation and extremist narratives is a significant challenge that requires a comprehensive response.
This is concerning – social media algorithms shaping political views, especially among young people, is a serious issue that needs more scrutiny. We need greater transparency and accountability around the algorithms used by these platforms.