Spreading Misinformation with Careless Sharing
According to a recent Pew Research Center study, more than half of Americans get some, if not all, of their news from social media. With built-in sharing options on all our devices, we are living in a one-click world where reposting has become a way of life.
The internet has been a great democratizer, allowing everyone to voice an opinion, but it is also a minefield of misinformation, confabulation and propaganda. That minefield has only grown murkier as more and more people share articles on Facebook without ever reading them.
Jinping Wang, an assistant professor of advertising in the University of Florida College of Journalism and Communications, and a team of researchers analyzed 35 million link-containing Facebook posts shared between 2017 and 2020.
The team discovered a troubling pattern: approximately 75% of the news links shared on Facebook are reposted without the users ever reading the content. This phenomenon, termed “shares without clicks,” calls into question the quality and veracity of online discourse and helps explain the spread of misinformation.
The researchers found that politically extreme content received significantly more shares without clicks than moderate content, with partisan users inclined to share unread material that aligned with their existing beliefs.
The study revealed distinctive patterns in sharing behavior across the political spectrum. More than 42 billion shares without clicks accounted for over three-quarters of all sharing activity, regardless of political affiliation.
There is also a marked difference between conservative and liberal sharers. Conservative users accounted for a far larger share of false-information sharing (76.9%) than liberal users (14.3%). However, the researchers emphasize that this disparity appears linked to the source material itself: the majority of false URLs in their dataset (76-82%) originated from conservative news sites.
“The virality of political content on social media appears to be driven by superficial processing of headlines and blurbs rather than systematic processing of the core content,” the researchers write. This finding helps explain why misinformation can spread so rapidly: most shared stories circulate without any verification of their underlying facts.
Not surprisingly, the study showed that confirmation bias is rampant in social sharing. Politically aligned content received more shares-without-clicks, indicating users are more likely to share unverified information that confirms their existing beliefs. This behavior contributes to what researchers describe as “ideological segregation” in the online world.
The study’s implications extend beyond partisan politics to fundamental questions about information literacy and social media design. The researchers suggest platforms like Meta could implement interface solutions that encourage more deliberate engagement with news content, such as prompting users to read articles before sharing or providing notices about sharing patterns.
As social media platforms continue to shape public discourse, addressing these sharing behaviors becomes increasingly important to fostering informed democratic dialogue in these highly partisan times.
The original paper, “Sharing without clicking on news in social media,” was published online in Nature Human Behaviour on Nov. 19, 2024.
Authors: S. Shyam Sundar, Eugene Cho Snyder, Mengqi Liao, Junjun Yin, Jinping Wang and Guangqing Chi.
This summary was written by Gigi Marino.
Posted: January 10, 2025