Minnesota Delays Enforcement of Social Media Transparency Law Amid Legal Challenge
Minnesota officials have agreed to postpone enforcement of a controversial social media transparency law against major tech companies until at least early 2025, according to court documents filed Friday. The agreement comes as tech industry group NetChoice pursues a legal challenge against the statute on constitutional grounds.
The Prohibiting Social Media Manipulation Act, which passed the Minnesota legislature last year, would require social media platforms to publicly disclose details about their recommendation algorithms and content evaluation processes. The law also mandates that companies reveal how they “impose limits on user engagement” – a provision that could force platforms to explain how they moderate content and restrict certain types of interactions.
NetChoice, whose membership includes tech giants Google, Meta, and Snap, filed a lawsuit in July challenging the constitutionality of the measure. The industry group argues that the law’s disclosure requirements violate First Amendment protections for its member companies.
According to the joint stipulation filed with U.S. District Court Judge Nancy Brasel, NetChoice will file a motion by November 21 seeking an injunction to block enforcement. Minnesota Attorney General Keith Ellison’s office will respond by December 19, with NetChoice’s reply due by January 23.
Though the law was originally scheduled to take effect in July 2024, Ellison’s office has committed not to pursue enforcement actions against NetChoice members until after the court rules on the injunction request. This effectively delays any potential enforcement until well into 2025.
At the heart of the legal dispute is the question of whether algorithmic recommendation systems should receive the same First Amendment protections as traditional editorial decisions. NetChoice argues they should, claiming in its complaint that “Minnesota cannot compel disclosure of these protected editorial algorithms any more than it could compel a newspaper to reveal its editorial decision-making process.”
The industry group further contends that these algorithms constitute valuable trade secrets whose forced disclosure would harm its members’ competitive positions.
This Minnesota case represents just one front in a broader national debate over social media regulation and algorithmic transparency. Several states have pursued similar legislation aimed at forcing tech companies to provide more information about how their platforms function, particularly regarding content recommendations that might affect young users or spread misinformation.
The challenge comes amid growing scrutiny of social media platforms’ influence on public discourse, mental health, and democratic processes. Critics argue that greater transparency is necessary to understand how these systems may amplify certain content or viewpoints over others, while tech companies maintain that their algorithms are proprietary technology protected from government interference.
The outcome of this case could significantly impact how states approach social media regulation going forward. If NetChoice succeeds in blocking the Minnesota law, it may establish precedent limiting similar transparency requirements in other jurisdictions. Conversely, if the court upholds the statute, it could embolden other states to enact comparable measures.
For advertisers and marketers who rely on these platforms, the case has particular significance. Any requirements forcing platforms to disclose algorithmic details could provide valuable insights into how content is prioritized and presented to users, potentially affecting advertising strategies and content optimization approaches.
As both sides prepare their legal arguments in the coming months, the technology industry and regulatory advocates will be watching closely to see how the court balances free speech protections against growing calls for algorithmic accountability.
6 Comments
Delaying enforcement of this law until 2025 feels like a long time to wait. I wonder if there’s a way to implement some interim disclosure requirements to give the public more insight in the meantime.
That’s a good point. Perhaps a phased approach with some initial disclosures could work, while the broader legal issues are resolved.
I’m glad to see Minnesota taking steps to hold social media companies more accountable, even if the enforcement timeline is lengthy. These platforms have a major influence on public discourse and need better oversight.
As someone who follows the tech industry, I’m curious to see how this plays out. Transparency around algorithms and content moderation is important, but the legal arguments around free speech need to be carefully considered too.
This law seems like a positive step, but I’m skeptical that delaying enforcement until 2025 is the right approach. The public deserves to know more about how these influential platforms operate in the near-term.
This law seems like a reasonable attempt to increase transparency around social media algorithms and content moderation. While it may face legal challenges, it’s important to understand how these platforms operate and their impacts on users.