An investigation by The Guardian has revealed that Substack, a popular online publishing platform, has been profiting from subscription-based newsletters that spread Nazi propaganda, white supremacist ideology, and antisemitic content.
The platform, which hosts approximately 50 million users worldwide, operates on a business model where it takes roughly 10% of the revenue generated from paid subscriptions. With about 5 million subscribers paying for premium content across the network, Substack earns a cut of the material it hosts, and the investigation raises serious concerns about the company's content moderation policies and revenue sources.
As part of its investigation, The Guardian reporters subscribed to a newsletter operating under the name “NatSocToday,” a clear reference to National Socialism—the ideology of Nazi Germany. This account charged subscribers $80 annually and described itself as “a newsletter featuring opinions and news important to the National Socialist and White Nationalist Community.” A subsequent review by The Jerusalem Post confirmed the publication contained Nazi imagery and content praising Adolf Hitler.
One particularly disturbing post on the “NatSocToday” newsletter suggested that Jews were responsible for starting World War II and described Hitler as “one of the greatest men of all time.”
The investigation revealed a concerning pattern of how extremist content spreads on the platform. Within just two hours of creating an investigative account, Substack’s recommendation algorithm directed Guardian reporters to 21 additional profiles featuring similar extremist content. Several of these accounts actively promoted each other’s material and had accumulated thousands of followers.
Among the profiles identified was one operated by a self-described “national socialist activist” who charged subscribers for access to content glorifying Hitler. Another UK-based account reportedly displayed Nazi imagery and published material engaging in Holocaust denial, directly contradicting established historical facts about the systematic murder of approximately six million Jews by Nazi Germany and its collaborators.
Substack’s published moderation policy states that the platform “cannot be used to publish content or fund initiatives that incite violence based on protected classes,” including religion and ethnicity. However, the newsletters identified in the investigation, which remained active as of February 15, promoted Nazi ideology and antisemitic conspiracy theories while carefully avoiding explicit calls for violence—apparently navigating within the platform’s stated guidelines.
The findings come amid growing concerns about rising antisemitism and online radicalization. Danny Stone, Chief Executive of the Antisemitism Policy Trust, a UK-based policy advocacy NGO, spoke to The Guardian about the risks of unchecked online extremism.
“People can be, and are, inspired by online harm to cause harm in the real world,” Stone told reporters. “The terrorist who attacked Heaton Park synagogue didn’t wake up one morning and decide to kill Jews; he will have been radicalized,” he added, referencing a deadly attack on a synagogue in Manchester, UK, last year.
Stone emphasized the need for more comprehensive regulation of harmful online content, noting that “algorithmic prompts and the amplification of harmful materials are extremely serious.” He added that while the UK’s Online Safety Act was intended to address illegal content, “very little is being done about so-called legal but harmful content.”
Following the publication of these findings, a spokesperson for the Holocaust Educational Trust expressed outrage, telling The Guardian: “Material like this that spreads conspiracy theories and Holocaust denial and which praises Hitler and the Nazis is not new, but clearly its reach is increasing. The idea that Substack profits from this hateful material and allows for it to be boosted via their algorithm is a disgrace.”
When approached by The Guardian, Substack did not respond to requests for comment. However, Hamish McKenzie, co-founder of Substack, had previously addressed the platform’s moderation decisions in a post on the site.
“I just want to make it clear that we don’t like Nazis either – we wish no one held those views,” McKenzie wrote. “But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetising publications) makes the problem go away – in fact, it makes it worse.”
McKenzie defended the platform’s approach, stating: “We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power. We are committed to upholding and protecting freedom of expression, even when it hurts.”
9 Comments
Shocking revelations. Substack must take swift action to address this issue and prevent its platform from being exploited by extremists and hate groups. Free speech is important, but not at the cost of amplifying dangerous ideologies.
Absolutely. Content moderation is critical, especially for platforms with large user bases. This is a serious breach of trust that Substack needs to rectify.
This is a deeply disturbing report. Substack must investigate these allegations thoroughly and take appropriate steps to remove any content associated with Nazism, antisemitism, or other forms of hate speech from its platform.
Agreed. Profiting from the spread of hateful ideologies is unacceptable. Substack needs to prioritize trust, safety, and ethical business practices.
This is a shocking revelation. Substack’s failure to prevent the use of its platform for promoting Nazi propaganda and antisemitism is deeply troubling. They must take immediate action to address this situation.
This is extremely concerning. Substack should be held accountable for profiting from hateful and dangerous content. They need to improve content moderation and remove any accounts linked to extremist ideologies.
Agreed. Platforms like Substack have a responsibility to ensure their services are not being used to spread harmful misinformation and propaganda.
Extremely concerning that Substack has been generating revenue from newsletters tied to Nazi content and antisemitism. This is a serious breach of responsibility and the company must act swiftly to address these issues.
Absolutely. Substack needs to review its content policies and strengthen its moderation efforts to prevent the further spread of hateful ideologies on its platform.