Meta’s fact-checking changes raise concerns about the spread of science misinformation across its platforms, according to public health experts. The social media giant announced on January 7 that it would discontinue its third-party fact-checking system in favor of a user-based “community notes” approach for flagging inaccurate or misleading content on Facebook and Instagram.
The shift has alarmed health communication specialists, including Harvard T.H. Chan School of Public Health’s K. Vish Viswanath, who warns that the new approach could lead to increased misinformation about science and health issues on platforms that reach billions of users worldwide.
“Whether community notes will work or not is something that’s worth independent evaluation,” said Viswanath, the Lee Kum Kee Professor of Health Communication at Harvard. While acknowledging that Meta’s previous fact-checking system wasn’t perfect, he expressed skepticism that eliminating professional fact-checkers would improve scientific accuracy across the platforms.
The change comes at a critical time when trust in scientific institutions remains fragile following the COVID-19 pandemic, during which social media platforms became significant vectors for health misinformation. Throughout the pandemic, Meta and other social media companies implemented various measures to combat false claims, including content labels for scientifically inaccurate information, reducing the visibility of misleading posts, and adding features to guide users toward reliable sources.
Health communication experts have documented the real-world consequences of online misinformation. “We know that exposure to misinformation can potentially lead to misbeliefs at variance with science,” Viswanath explained, pointing to vaccine hesitancy as a prime example where online falsehoods have translated to public health challenges.
Meta’s move away from professional fact-checking represents a significant shift in how the company approaches content moderation. The new “community notes” system appears to follow a model similar to the one used by X (formerly Twitter), relying on users rather than specialized fact-checkers to identify and contextualize misleading content.
Critics argue that this approach potentially off-loads responsibility for accuracy onto users who may lack expertise in complex scientific or medical topics. Without professional fact-checkers who understand the nuances of scientific research, users could be left to navigate contradictory claims without proper guidance.
Industry observers note that Meta’s decision comes amid broader debates about content moderation on social platforms and increasing pressure from various stakeholders about how tech companies should handle controversial or potentially harmful information.
To counter the potential flood of scientific misinformation, Viswanath offered several recommendations. He emphasized the need for proactive promotion of accurate scientific information by scientists, research institutions, and professional scientific societies. Additionally, he highlighted the vital role of local community-based and faith-based organizations in helping their communities build resilience against misinformation.
Journalists, particularly those working at the local level, also bear responsibility for accurately communicating scientific findings, according to Viswanath. Media literacy experts suggest this multi-layered approach—combining institutional messaging, community engagement, and responsible journalism—may help mitigate the effects of reduced professional fact-checking.
The implications of Meta’s policy change extend beyond individual health decisions to broader public health outcomes, potentially affecting vaccination rates, adherence to health guidelines during future outbreaks, and trust in medical science.
As Meta implements its new approach, health communication researchers will be watching closely to assess whether community-based moderation can effectively identify and contextualize complex scientific misinformation or if the absence of professional fact-checkers will further erode information quality on platforms used by billions worldwide.