In a digital landscape where social media shapes public discourse, questions about content moderation and platform responsibility continue to intensify. Recent discussions with Inevitable West, a prominent figure on X (formerly Twitter), have highlighted ongoing concerns about religious bias and misinformation on the platform.

When confronted about their controversial religious content, Inevitable West defended their position, stating they would apply the same standards to all religions. Perhaps more troublingly, they confirmed they would never remove their own posts, even when proven false—a stance that raises serious concerns given their global reach and influence.

This revelation comes amid broader questions about X’s content moderation practices, both before and after Elon Musk’s high-profile acquisition. Critics have long accused the platform of bias in its enforcement actions, while others have questioned whether pre-Musk Twitter restricted free expression.

These concerns were explored in depth during a 2023 Panorama investigation, in which former Twitter employees expressed significant apprehension about the platform's ability to protect users. Their primary concerns centered on defending users from coordinated trolling campaigns, state-sponsored disinformation operations, and child sexual exploitation content, problems they directly attributed to the company's extensive workforce reductions.

When approached for comment during that investigation, X did not respond to the specific allegations. However, Musk later reacted on the platform itself, sarcastically tweeting about the Panorama episode: “Sorry for turning Twitter from nurturing paradise into a place that has… trolls.” He followed this with the casual observation that “trolls are kinda fun,” seemingly dismissing the serious nature of the concerns raised.

Musk has previously defended the sweeping staff reductions, claiming that financial losses left him with “no choice” but to significantly downsize the company’s workforce. These cuts came during a tumultuous period of transition as Twitter transformed into X under his leadership.

Lisa Jennings Young, who served as head of content design at Twitter until departing in 2022, offers a sobering perspective on the current state of the platform and social media broadly. “I feel like we’re all living through a vast social experiment [on humanity],” she notes, emphasizing that this experiment lacks clear parameters or objectives.

What makes the situation particularly concerning, according to Jennings Young, is that “it is not a controlled social science experiment [but one] we’re all a part of.” This unplanned, ungoverned approach to platform management means that no one—not users, not experts, and perhaps not even platform leadership—can predict the ultimate impact these moderation policies will have on public discourse and society.

The ongoing tension between free expression and responsible content moderation reflects a fundamental challenge facing all social media platforms. However, X’s approach under Musk has drawn particular scrutiny due to the platform’s outsized influence on political discourse, breaking news, and global conversations.

As users continue to navigate this evolving digital environment, questions persist about how platforms should balance competing priorities: protecting vulnerable users from harm, preventing the spread of dangerous misinformation, and preserving open dialogue. Without transparent policies and consistent enforcement, critics argue that platforms risk becoming amplifiers of the internet’s most problematic voices.

The case of Inevitable West serves as just one example of how content creators on X can reach global audiences with little accountability for accuracy or impact—a microcosm of the larger questions facing social media governance in the modern era.


A professional organisation dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.

Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2025 Disinformation Commission LLC. All rights reserved.