How Users Identify Health Misinformation in Short Videos: Study Reveals Three Key Pathways
Short-video platforms have become critical channels for public health information, with more than 1.07 billion users in China alone accessing health content through services such as Douyin, TikTok, and YouTube Shorts. Yet these platforms’ low barriers to content creation and the rise of AI-generated material have accelerated the spread of health misinformation, posing significant risks to public health.
A new study from researchers at Beijing University of Chinese Medicine examines how users identify health misinformation in short videos, revealing three distinct cognitive pathways that could inform targeted intervention strategies.
“The visual storytelling style, concise format, and personalized algorithm recommendations on these platforms reduce the cognitive effort needed to understand complex medical terminology,” says lead researcher Xing Zhai. “But this accessibility comes with risks, as inaccurate health claims can spread at unprecedented speed and scale.”
The research team employed a mixed-methods approach, combining in-depth interviews with 47 participants, questionnaire data from 279 respondents, and advanced statistical modeling to identify key factors affecting misinformation detection.
Three Dimensions of Misinformation Detection
The findings revealed that users’ ability to identify misleading health information is influenced by three key dimensions: information quality, user characteristics, and external environment.
Information quality emerged as particularly important, with content logic and information structure significantly enhancing users’ ability to spot misinformation. Interestingly, professionally polished narrative styles were found to potentially reduce vigilance, making users more susceptible to well-packaged but inaccurate information.
“Users evaluate the authenticity of health information primarily by assessing logical coherence,” explains researcher Rui Wang. “But highly stylized, professional-sounding presentations can actually lower users’ guard, making it harder for them to discern misinformation.”
Within user characteristics, cognitive level played a crucial role, with higher analytical capabilities directly correlating with better discernment. External factors such as likes, comments, and shares also significantly influenced judgment, highlighting the social nature of information evaluation on these platforms.
Three Distinct User Pathways
Perhaps the study’s most significant finding was the identification of three distinct pathways users follow when evaluating health content:
- The Analytical Central Route: used primarily by high-cognition users who engage in deep analysis, focusing on content logic, narrative expression, and information structure.
- The Content-assisted Peripheral Route: typical of low-cognition users who depend on superficial cues such as professional language or engagement metrics rather than evaluating the content’s logical coherence.
- The Cognition-assisted Peripheral Route: employed by medium-cognition users who possess some analytical capacity but still favor peripheral cues like presentation style and social validation.
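As an illustrative sketch only (not code from the study), the three pathways can be thought of as a routing from a user's cognitive level to the cues that user weighs most heavily when judging a health video:

```python
# Hypothetical sketch: maps the cognition levels described in the study
# to the evaluation pathway and dominant cues reported for each group.
# The function name and dictionary structure are illustrative, not from the paper.

def evaluation_pathway(cognition: str) -> dict:
    """Return the pathway and dominant cues for a given cognition level."""
    pathways = {
        "high": {
            "pathway": "analytical central route",
            "dominant_cues": ["content logic", "narrative expression",
                              "information structure"],
        },
        "low": {
            "pathway": "content-assisted peripheral route",
            "dominant_cues": ["professional language", "likes",
                              "comments", "shares"],
        },
        "medium": {
            "pathway": "cognition-assisted peripheral route",
            "dominant_cues": ["presentation style", "social validation"],
        },
    }
    return pathways[cognition]
```

The point of the sketch is that the same video is evaluated against different evidence depending on who is watching, which is why the researchers argue for differentiated interventions rather than a single countermeasure.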
“These results reveal the diversity of user information processing modes,” says researcher Huanqing Yang. “Different users follow fundamentally different paths when determining what health information to trust.”
Implications for Misinformation Governance
Based on these findings, the researchers developed a multi-level governance framework addressing health misinformation at micro (user), meso (platform), and macro (policy) levels.
At the user level, the study recommends differentiated approaches: providing powerful analytical tools for high-cognition users, personalized education for medium-cognition users, and simplified credibility indicators (like color-coded labels) for low-cognition users.
For platforms, the researchers suggest implementing advanced algorithms to detect logical inconsistencies and evidence issues in health content, while optimizing comment sections to prioritize rational discussion and expert contributions.
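The comment-section suggestion can be sketched as a simple re-ranking: surface verified-expert contributions ahead of popularity alone. This is an illustrative toy, assuming hypothetical `Comment` fields; the study does not prescribe an implementation:

```python
# Illustrative only: re-rank a comment section so expert contributions
# appear first, per the study's platform-level recommendation.
# The Comment fields and ranking key are hypothetical.
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    is_verified_expert: bool
    likes: int

def rank_comments(comments: list[Comment]) -> list[Comment]:
    # Sort experts before non-experts; within each group, most-liked first.
    return sorted(comments, key=lambda c: (not c.is_verified_expert, -c.likes))
```

Under this ordering, a lightly liked expert reply still outranks a heavily liked lay comment, dampening the engagement-metric cues that the study found can mislead peripheral-route users.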
At the policy level, the study calls for explicit industry standards for health-related short videos, stronger accountability mechanisms, and collaborative monitoring systems involving health authorities, regulators, industry associations, and platforms.
“Effective governance must move beyond single-dimensional limitations and establish a multi-dimensional collaborative model,” emphasizes Yiyi Wang, a co-author of the study. “We need coordinated efforts at all levels to address this growing challenge.”
The research comes at a critical time as short-video platforms continue to gain prominence as health information sources, with the COVID-19 pandemic having highlighted both the potential and risks of digital health communication.
While the study focused on the Chinese context, its frameworks and findings offer valuable insights for addressing health misinformation globally, especially in regions where short-form video has become a dominant medium for information consumption.