The spread of far-right ideologies through online spaces culminated in violent riots across the UK last year, according to a comprehensive investigation that analyzed thousands of posts in Facebook groups where extremist content thrives.
The summer 2024 riots, primarily targeting asylum seekers and Muslims, shocked many observers due to their intensity and the profile of participants. Most rioters were local residents with no formal affiliation to established far-right organizations. Some even rejected the “far-right” label entirely, carrying banners declaring: “We’re not far-right, we’re just right.”
Analysis revealed a digital ecosystem bound together by deep distrust of government institutions, anti-immigrant sentiment, nativism, conspiracy theories, and misinformation. Experts consulted during the investigation confirmed that many posts contained explicitly far-right and extremist content that could contribute to radicalization processes.
The research identified five key themes driving discourse within these online communities. First was a profound distrust of mainstream institutions, with approximately 40% of sampled posts containing anti-establishment rhetoric. Politicians were routinely labeled as “traitors” and “scum,” while police, judiciary, and media were portrayed as corrupt or compromised. Even the Royal National Lifeboat Institution was vilified as a “taxi service” for “illegals.”
Anki Deo of advocacy group Hope Not Hate identified Reform UK as the main beneficiary of this institutional distrust. The party operates within a wider ecosystem including GB News, social media influencers, and far-right groups, all driving anti-establishment messaging.
Cambridge University professor Sander van der Linden noted that undermining democratic institutions echoes fascist tactics throughout history. “Generally, fascists and extremists are trying to undermine institutions of truth, facts and education because that is what’s standing in their way – an informed citizenry,” he explained, adding that many people engaging with such content are unaware they’re participating in extremist discourse.
The second major theme was scapegoating immigrants, with approximately one in seven posts focused on immigration issues. About 10% contained dehumanizing, generalizing, or overtly racist language. Posts often portrayed migrants as dangerous, deceitful, or culturally incompatible, using coded language about “military-aged men” and “grooming gangs” to imply inherent threats.
Radicalization researcher Dr. Julia Ebner explained that this systematic demonization of immigrants plays a crucial role in radicalization processes. When combined with other extremist ideologies, such discourse can create a “toxic cocktail” resulting in riots or targeted violence.
The third theme centered on portraying white British people as victims, with about 4% of posts claiming “indigenous,” “British,” “white,” or “Christian” identities were under threat. Van der Linden explained that repeated exposure to such messaging creates an “illusory truth” effect where claims begin to feel factual simply through repetition and social reinforcement.
The fourth theme involved defending or rationalizing the 2024 riots, with hundreds of posts supporting what users viewed as legitimate protests or defending “freedom of speech” for those charged with online offenses. This included support for individuals like Lucy Connolly, jailed for 31 months after calling for hotels housing asylum seekers to be burned.
Finally, the groups served as entry points for deeper conspiracy theories. About 5% of posts contained conspiracy content, including climate crisis denial and “replacement theory” claiming shadowy global elites were orchestrating demographic changes. These theories gained traction in an environment where confirmation bias was already prevalent.
What makes these online spaces particularly concerning is their amplification effect. Ebner noted that while many narratives mirror historical far-right rhetoric, algorithmic amplification accelerates radicalization at unprecedented speeds. Van der Linden added that social media allows instant connection with thousands of like-minded individuals, distorting perceptions about societal consensus.
The investigation found that Meta’s recent reversal of certain content takedown policies may have emboldened users to share more extreme content without fear of consequences. While not all users demonstrated the same degree of radicalization, experts identified clear patterns of extremist ideology throughout the sampled content.
As Deo observed, “The line of what makes someone a member of the far right is increasingly blurred: whereas previously someone might have had to join an organisation, now people can participate or just observe through online groups, dipping in and out.”
The combined membership of these interconnected Facebook groups exceeded 600,000 accounts as of July 2025, though this likely includes significant overlap of individuals belonging to multiple groups.