Algorithms and Health Misinformation: A Case Study of Vaccine Books on Amazon
Many technology companies, such as Amazon and Facebook, use algorithms to determine what content to present to users based on their previous behavior and preferences. Search algorithms are also a key determinant of what information people see: platform search results are one of the main ways people find information and decide what to believe.
But how these algorithms work remains a mystery to most of the general population. The technology can also create a dangerous feedback loop that reinforces misinformation: the more you click on a subject, the more recommendations for that subject you receive, creating the illusion that because you keep seeing something, it must be true.
To understand how Amazon's recommendations could spread health misinformation, University of Florida College of Journalism and Communications Telecommunication Assistant Professor and Trust Scholar Jieun Shin and Thomas Valente of the USC Keck School of Medicine examined Amazon's rankings and recommendations of books returned for the search term "vaccine." According to the World Health Organization, vaccine hesitancy is one of the major threats facing the world, and exposure to negative information about vaccines can affect public opinion of vaccination.
The authors chose to examine books because research shows that "books are considered to be a voice of experts." For seven days, the researchers pulled the first 10 pages of Amazon's "vaccine" search results and coded each result into one of three categories: vaccine-supportive, vaccine-hesitant, or unclear. For each book that appeared on the first 10 pages during those seven days, they also content-coded the recommended books displayed under the "Customers who bought this item also bought" prompt.
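To illustrate the kind of tally this coding step produces, here is a minimal sketch in Python. The records, field names, and values are hypothetical examples, not the authors' actual dataset or code; the sketch only shows how coded search results could be counted by stance.

```python
from collections import Counter

# Hypothetical coded records: one entry per book appearance in the
# "vaccine" search results (day of capture, search rank, coded stance).
# This is an illustrative sketch, not the study's real data or tooling.
coded_results = [
    {"day": 1, "rank": 1, "stance": "hesitant"},
    {"day": 1, "rank": 2, "stance": "supportive"},
    {"day": 1, "rank": 3, "stance": "hesitant"},
    {"day": 2, "rank": 1, "stance": "hesitant"},
    {"day": 2, "rank": 2, "stance": "unclear"},
]

# Count how often each stance appears across all captured result pages.
stance_counts = Counter(record["stance"] for record in coded_results)
total = sum(stance_counts.values())

for stance, count in stance_counts.most_common():
    print(f"{stance}: {count} ({count / total:.0%} of coded results)")
```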
Of the books analyzed, 62 percent were vaccine-hesitant. The three top-ranked books across the week and the three most frequently recommended books were all vaccine-hesitant. As a result, most of the information delivered to a user who is already skeptical of vaccination will likely reinforce that user's beliefs. A user with no opinion on vaccines, meanwhile, may conclude that vaccine-hesitant information is the most valid because it is the most prevalent.
While Amazon and other platform companies claim their algorithms are neutral, the research suggests that pre-programmed algorithms may unintentionally channel users toward views that are not supported by the scientific and medical community. This may create the illusion that a misinformed minority view is widely accepted by the public.
The researchers believe it is essential to raise public awareness of algorithmic filtering and to have a thorough discussion about the role algorithms play in disseminating information about public health issues.
Authors: Jieun Shin and Thomas Valente
The original article, “Algorithms and Health Misinformation: A Case Study of Vaccine Books on Amazon,” was published in the Journal of Health Communication on June 14, 2020.
This summary was written by Alexandra Avelino, UFCJC M.A.M.C. 2020, Student Affairs Program Coordinator at the UF College of Veterinary Medicine.
Posted: August 5, 2020
Insights Categories:
Communication and Technology, Health and Science, Trust
Tagged as: Amazon, Jieun Shin, Misinformation, Vaccine