“When will it end, when will it stop?” Debbie Duncan asks, her voice weary after 18 months of relentless online abuse following the death of her son, Jay Slater.
The Lancashire mother has been targeted with horrific messages since her son’s disappearance and death in Tenerife last year. Reading from her phone, she shares some of the most disturbing comments: “Debbie deserved to lose her son,” “How can you possibly respect a mother grifting off her own son’s death?” and “How do you know it’s Jay in that coffin – he needs digging up.”
“Just scroll, scroll and scroll, and they are still there,” she says. “Jay’s just been dehumanised.”
Hundreds of millions of pieces of content about Jay Slater continue to circulate online, persisting even after a coroner ruled his death in a remote Tenerife ravine was a tragic accident. The barrage of conspiracy theories, misinformation and abuse has been overwhelming.
“I don’t think I’d be here if I sat every day and read everything that was being said,” Debbie admits.
Her experience has driven her to campaign for legislative change to hold social media platforms accountable for harmful content. Despite numerous attempts to have misinformation and abusive content removed, she’s found little success.
“We just want to have some legislation around content,” she explains. “It’s about the platforms having that responsibility to take down the misinformation, the harassment, bullying.”
Jay’s case has become emblematic of a disturbing trend in online behaviour, with his family becoming victims of what experts call “tragedy trolling” at unprecedented levels. The campaign for greater regulation is backed by the charity Missing People, which works with a growing number of families experiencing similar abuse.
“It feels quite out of control,” says Josie Allan from Missing People. “We know with the development of AI, there’s going to be even more complicated issues. People are creating fake news about missing cases, potentially making fake content using missing people’s faces or voices. We know that would be horrendous for families to see and could really misdirect police resources and investigations.”
This emerging problem has already manifested in disturbing ways. In Australia, the search for missing four-year-old Gus Lamont was recently disrupted by a fake AI image showing a man supposedly carrying the child. Anonymous accounts increasingly create bogus missing people posts, often linked to scam websites, exploiting public goodwill.
Kevin Gosden, whose son Andrew has been missing since 2007, when he disappeared at the age of 14, has experienced similar ordeals with online misinformation.
“Just before the 18th anniversary of Andrew’s disappearance, we suddenly became aware of articles online claiming that Andrew’s body had been found, that his DNA had been found somewhere, that police had been concealing CCTV footage of him,” Gosden says. “Utter nonsense. None of that’s true.”
The false information takes a severe psychological toll. “I have been very close to what I know is going to be a breakdown again a number of times,” Gosden reveals. “If you’ve got a lot of false information out there, it just doesn’t help find the lad we love.”
Experts point to monetisation as a key driver of this harmful content. At the CrimeCon conference in Manchester, Andy Hobbs, who sells murder mystery games, observed: “Unfortunately, views means more money. And until that gets looked at, I don’t think any regulation will come in. It’s in the interest of social media networks to get more views, more hits.”
Law enforcement officials are increasingly concerned about the impact on investigations. Assistant Chief Constable Damien Miller, the national policing lead for missing persons, explained how misinformation hampers police work.
“It takes policing away from those inquiries that we need to be following, because it’s fake, it’s false information,” Miller says. “It’s hurtful, it is harmful to the families, but it’s also then misdirecting police investigations.”
Forensic psychologist Kerry Daynes is even more direct: “Some amateur sleuths are very dangerous. Because people want their 15 minutes of fame on social media, they knowingly hamper police investigations. In any other world that would result in a prosecution.”
Back in Lancashire, Debbie Duncan keeps boxes of cards and gifts in Jay’s bedroom – reminders of the compassion and support they’ve received alongside the hatred. She’s determined to prevent other families from experiencing similar trauma through her petition to Parliament.
Looking at photos of her son, she says with determination, “It’s torture. I just look at his picture and if it’s the last thing I can do for you Jay – I’ll try my blimmin’ hardest.”