Children as young as eight have been exposed to Nazi glorification and jihadist propaganda on Roblox, according to a troubling investigation by online safety experts.

The popular gaming platform, which attracts more than 66 million daily users worldwide, has become a breeding ground for extremist content despite its predominantly young user base. Researchers discovered numerous instances where games on the platform featured Nazi symbols, recreations of terrorist attacks, and content promoting radical ideologies.

The Centre for Countering Digital Hate (CCDH), which conducted the investigation, identified more than 300 Roblox experiences that contained concerning extremist content. These ranged from digital recreations of the 2019 Christchurch mosque shootings to games allowing players to roleplay as Nazi soldiers or ISIS fighters.

“What we found was deeply disturbing,” said Imran Ahmed, chief executive of the CCDH. “Roblox presents itself as a safe space for children, but our research reveals it’s harboring content that glorifies terrorism, promotes far-right extremism, and exposes impressionable young minds to radicalization.”

The investigation highlights significant gaps in Roblox’s content moderation systems. Despite the platform’s stated commitment to safety, researchers were able to find games with titles explicitly referencing terrorist groups and Nazi symbolism. Some games had been active for months and had amassed thousands of visits before being reported.

Child safety experts have expressed alarm at these findings, noting that young users are particularly vulnerable to such content. Dr. Samantha Knight, a child psychologist specializing in digital media, explained that “children in this age group are still developing critical thinking skills and can struggle to differentiate between fantasy and reality, making them susceptible to extremist messaging disguised as gaming content.”

Roblox has responded to the investigation by removing the flagged content and issuing a statement reaffirming its commitment to user safety. “We have zero tolerance for content that promotes terrorism, violence, or extremist ideologies,” a company spokesperson said. “We employ both automated systems and human moderators to identify and remove inappropriate content, but we acknowledge that some material may slip through these safeguards.”

The platform, valued at over $25 billion, has faced increasing scrutiny over its content moderation practices as its popularity has surged during the pandemic. Roblox allows users to create and share their own games, which presents unique challenges for monitoring the vast amount of user-generated content uploaded daily.

Regulatory bodies are taking notice. The Online Safety Bill currently making its way through UK Parliament would impose stricter requirements on platforms like Roblox to protect children from harmful content. Similar legislative efforts are underway in the European Union, Australia, and parts of the United States.

Parents’ groups have called for more transparency from Roblox about how it identifies and removes dangerous content. Sarah Thompson, founder of Parents for Digital Safety, urged more proactive measures: “Waiting for users to report problematic content isn’t enough. Platforms that market themselves to children need to invest significantly more in preventing this material from appearing in the first place.”

Industry analysts note that Roblox faces a difficult balancing act. “The platform’s success is built on user creativity and freedom, but that openness creates vulnerabilities,” explained Marcus Chen, a digital policy researcher at the Technology Ethics Institute. “Implementing stricter controls without stifling the creative ecosystem that makes Roblox appealing is a complex challenge.”

For parents, experts recommend maintaining open communication with children about their online experiences, using parental controls, and regularly checking what games children are playing on the platform.

The CCDH has called for Roblox to publish quarterly transparency reports detailing the volume of extremist content identified and removed, increase investment in human moderation for content aimed at younger users, and implement more sophisticated automated detection systems.

As online gaming platforms continue to grow in popularity among children, the findings underscore broader concerns about how digital spaces are regulated and monitored to ensure young users’ safety.
