As a disturbing new online trend called “Red Dolphin” spreads among teenagers, experts warn that it bears striking similarities to the infamous “Blue Whale” challenge, which first made headlines in 2016. The latest development has raised fresh concerns about digital manipulation targeting vulnerable adolescents.
In Almaty, the recent suicides of two teenage girls have reignited fears about online death games. One of the victims was the daughter of prominent environmentalist Evgeny Mukhamedzhanov, head of the ECO Network project, and artist Balkhiya Mukhamedzhanova; she was a bright young woman from a stable family background.
The tragedy has fueled speculation about the “Red Dolphin” challenge, widely considered a successor to the “Blue Whale” phenomenon that gained notoriety nearly a decade ago. These challenges typically involve “curators” who issue increasingly disturbing tasks to participants via social media, designed to control behavior, inflict psychological harm, and ultimately push vulnerable teenagers toward self-destruction.
The original “Blue Whale” game traces back to late 2015, when a Russian teenager named Renata Kambolina (known online as Rina Palenkova) posted a farewell selfie on VKontakte before taking her own life. Her image became tragically iconic among depressed teenagers and was even commercialized by unscrupulous entrepreneurs who sold merchandise featuring her final message: “Nya. Bye.”
The mechanics of these challenges follow a methodical pattern. Participants typically signal interest by posting specific hashtags such as #IamInGame or #BlueWhale, after which a “curator” contacts them and explains the rules. Tasks are issued at 4:20 AM each day and require photo or video evidence of completion. Early assignments apply psychological pressure, such as watching horror films or listening to depressing music, then gradually escalate to self-harm, dangerous activities intended to blunt the fear response, and ultimately suicide.
“These challenges use techniques similar to cult programming,” explains a child psychologist who requested anonymity. “Sleep deprivation, isolation, rewards for compliance and punishment for refusal – all designed to suppress free will while creating the illusion of choice.”
The media’s role in amplifying these phenomena has drawn criticism. After Russian newspaper Novaya Gazeta published an exposé on “death groups” in 2016, interest in the challenge exploded. The resulting “Streisand effect” – where attempts to suppress information lead to wider distribution – helped the concept spread internationally.
Philipp Budeykin, known online as “Philipp Lis,” was arrested in Russia in November 2016 for his role in administering such groups. Though investigated for potential involvement in more than 15 teenage deaths, he was ultimately sentenced to three years and four months in prison for driving two girls to attempted suicide. After his release, Budeykin blamed journalists for sensationalizing the story.
Another administrator, Alexander Glazov, was prosecuted in 2017–2018. In a bizarre twist, Glazov later joined the Wagner Group to fight in Ukraine and now, despite having previously endangered schoolchildren, delivers patriotic lectures to Russian students as a war veteran.
The international impact has been profound. Similar cases have been reported in India, China, Egypt, Iran, Kenya, Paraguay, Tunisia, and Spain. In the Basque Country, prosecutors investigated a minor’s suicide linked to both the “Blue Whale challenge” and an international suicide pact organized by a teenager in Argentina.
Three months after losing her daughter, Balkhiya Mukhamedzhanova channeled her grief into an art exhibition titled “The Whirlpool,” featuring poignant portraits of her daughter Aisha. Behind each delicate stroke lies a mother’s anguish and quest for answers about what drives teenagers to such desperate acts.
Digital safety experts recommend a multilayered approach to protection. “Prevention requires developing digital literacy, teaching stress resilience, and fostering emotional intelligence,” says Dr. Elena Mikhailova, child psychologist at the Institute for Digital Wellbeing. “Supervision matters, but without trust, it becomes a barrier rather than protection.”
Technology companies are increasingly employing artificial intelligence to identify dangerous content. Advanced algorithms analyze context, emotional tone, and behavioral patterns to detect users who might need intervention.
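As a rough illustration only (the article does not describe any platform’s actual system), such a pipeline might combine simple signals, for example known hashtags, distress-related wording, and unusual posting times, into a score that flags an account for human review. The Python sketch below is hypothetical; the hashtags, keywords, weights, and threshold are assumptions for illustration, not criteria used by any real platform.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical signals loosely based on patterns described in the article:
# participation hashtags, distress-related wording, and pre-dawn activity.
FLAGGED_HASHTAGS = {"#iamingame", "#bluewhale"}           # assumed examples
DISTRESS_KEYWORDS = {"curator", "final task", "goodbye"}  # assumed examples


@dataclass
class Post:
    text: str
    posted_at: datetime


def risk_score(posts: list[Post]) -> float:
    """Combine simple heuristics into a crude 0..1 risk score.

    Real moderation systems weigh far richer context (language models,
    behavioral history, network signals); this is only a toy sketch.
    """
    score = 0.0
    for post in posts:
        text = post.text.lower()
        if any(tag in text for tag in FLAGGED_HASHTAGS):
            score += 0.5
        if any(word in text for word in DISTRESS_KEYWORDS):
            score += 0.2
        # Activity between 3 and 5 AM mirrors the 4:20 AM task pattern.
        if 3 <= post.posted_at.hour < 5:
            score += 0.1
    return min(score, 1.0)


if __name__ == "__main__":
    posts = [
        Post("#IamInGame waiting for the curator", datetime(2025, 3, 1, 4, 20)),
        Post("Normal post about school", datetime(2025, 3, 1, 15, 0)),
    ]
    if risk_score(posts) >= 0.5:
        print("Account flagged for human review")
```

In practice, such scores would only trigger a referral to trained human moderators or crisis-support resources rather than any automated action.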
As “Red Dolphin” generates headlines similar to its predecessor, mental health advocates urge a more thoughtful response than the moral panic that surrounded “Blue Whale.” The real challenge, they suggest, isn’t just combating these games but addressing the underlying issues of teenage isolation, digital media literacy, and mental health support.
“The question isn’t whether death groups exist,” reflects Mikhailova, “but how society’s reaction – from media sensationalism to political exploitation – sometimes amplifies rather than mitigates the harm. Every panic-inducing headline potentially contributes to the problem we’re trying to solve.”
22 Comments
The tragic loss of these young lives is heartbreaking. More research is needed to fully understand the psychology and mechanics behind these challenges, so effective prevention and intervention strategies can be developed.
Absolutely, we must find ways to reach and support vulnerable teens before they become ensnared in these dangerous online traps.
As a society, we have a moral obligation to protect our children. These online challenges are a symptom of deeper problems that demand our urgent attention and action.
The fact that these challenges keep evolving to target new generations is extremely worrying. Proactive, collaborative action is needed to shut down and prevent the spread of such deadly content.
The similarities between the ‘Red Dolphin’ and ‘Blue Whale’ trends are deeply unsettling. Clearly, the predators behind these games are exploiting the same vulnerabilities in new ways.
Absolutely. We need a multi-faceted approach to disrupt these manipulative tactics and support young people in developing healthy online habits.
The tragic loss of these young lives is a devastating reminder of the very real dangers that exist in the digital realm. We must do more to safeguard our children.
This is a deeply concerning trend. Social media platforms must do more to protect vulnerable youth from these predatory online ‘challenges’. Monitoring and shutting down these toxic communities should be a top priority.
I agree, the safety and wellbeing of young people needs to come first. These online death games are a disturbing and unacceptable risk.
This is a chilling development that demands immediate action. Social media platforms, policymakers, and mental health professionals must work together to protect vulnerable youth.
Agreed. Comprehensive, collaborative solutions are needed to combat these emerging online threats and prevent further tragedies.
As a parent, this news fills me with deep concern. We need better digital literacy programs to empower young people to recognize and avoid these manipulative tactics.
Agreed, teaching critical thinking skills around social media use is crucial. Families, schools, and platforms all have a role to play in protecting youth.
This is a truly devastating situation. My heart goes out to the families and communities impacted by these tragic losses. We must do everything in our power to protect vulnerable youth.
As a concerned citizen, I hope the authorities can quickly investigate the ‘Red Dolphin’ challenge and dismantle any organized efforts behind it. Protecting young lives has to be the top priority.
As a parent, I’m deeply concerned about the rise of these deadly online trends. We need a comprehensive, multi-stakeholder approach to combat the spread of such manipulative content.
I agree wholeheartedly. The safety and wellbeing of our children must be the top priority for all of us, both online and in the real world.
These online ‘games’ are a disturbing example of how digital spaces can be weaponized to harm the most vulnerable members of our society. Urgent action is clearly needed.
This is a complex issue with no easy solutions. However, the loss of young lives is unacceptable. Increased resources for mental health support and crisis intervention could make a real difference.
That’s an excellent point. Addressing the underlying issues of depression, anxiety, and suicidal ideation among youth has to be part of the solution.
The details of this case are truly heart-wrenching. We must redouble our efforts to educate young people about the dangers of online manipulation and provide them with the support they need.
Absolutely. Early intervention and mental health resources could make all the difference in preventing vulnerable teens from falling victim to these predatory challenges.