Power-Seekers More Likely to Share Fake News, UCL Study Reveals
People motivated by power and the desire to influence others are more prone to sharing fake news on social media platforms, according to groundbreaking research led by University College London.
The study, published in the journal Computers in Human Behavior, reveals a concerning link between power motivation and the spread of misinformation online, a phenomenon that has increasingly plagued social media landscapes worldwide.
Researchers conducted four separate experiments involving 1,882 participants who were presented with a mix of real and fake social media posts. Participants indicated which posts they would be inclined to share on social media while completing assessments measuring their power values, personality traits related to dominance, and their desire to influence others through social media activity.
The results showed a clear pattern: individuals motivated by power were significantly more likely to share fake news content, yet showed no increased tendency to share legitimate news. This suggests a selective sharing behavior that favors misinformation when it serves personal influence goals.
“Our findings suggest that people who are motivated by obtaining power, and influencing others, may share misinformation without concern for its accuracy, as a form of brokerage to gain a following and to control narratives,” explained lead author Professor Ana Guinote from UCL’s Psychology and Language Sciences department.
Notably, the research distinguished between holding power in real life versus being motivated by it. Whether participants held positions of power in their professional lives did not predict their likelihood of sharing misinformation. Instead, the psychological drive for power and dominance emerged as the critical factor.
Participants who scored highly on dominance measures not only shared more fake news during the experiment but also self-reported having knowingly shared misinformation in the recent past, suggesting they were aware the information was unreliable but chose to share it anyway.
The findings have significant implications as social media platforms continue to grapple with the rapid spread of misinformation. In recent years, fake news has become a growing concern for democratic processes, public health communication, and social cohesion globally. Major platforms including Facebook, Twitter, and YouTube have implemented various measures to combat misinformation, though with mixed results.
Professor Guinote further noted: “Other studies have suggested that people are more likely to share misinformation if it is consistent with their beliefs and goals, such as political goals in the lead-up to an election, so it may be that those who are motivated by political gain and influencing an election might be particularly likely to spread misinformation.”
This research adds a crucial psychological dimension to our understanding of how misinformation spreads online. While previous studies have focused on factors like political polarization, confirmation bias, and digital literacy, the UCL study highlights how personal motivations for power and influence can drive misinformation sharing behavior.
The implications extend beyond individual psychology to broader concerns about information integrity in the digital age. As social media platforms evolve, understanding the motivations behind misinformation sharing becomes essential for developing effective interventions.
Social media companies may need to consider how their platforms inadvertently reward power-seeking behaviors through metrics like shares, likes, and follower counts. Meanwhile, digital literacy programs might benefit from addressing not just how to identify fake news, but also the psychological drivers that make sharing it appealing.
As elections approach in numerous countries worldwide, these findings take on additional significance, suggesting that power-motivated individuals might be particularly susceptible to spreading political misinformation during critical democratic processes.
The UCL research represents an important step in the multidisciplinary effort to combat online misinformation by illuminating one of the key psychological mechanisms that fuels its spread.