Synthesis produced by the Fondation Descartes of the following research paper:
Pennycook, G., Epstein, Z., Mosleh, M. et al. Shifting attention to accuracy can reduce misinformation online. Nature (2021). https://doi.org/10.1038/s41586-021-03344-2
In this article, a team of psychologists seeks to explain why we sometimes share false information on the Internet. The authors first discuss the so-called confusion-based account, which postulates that individuals share false information because they genuinely believe it to be true.
To test this hypothesis, Pennycook and his team asked 1,015 individuals living in the United States to evaluate the accuracy of several news stories. Each story was either objectively true or false, and either pro-Democrat or pro-Republican (independently of its accuracy). Participants were then asked to indicate whether they would consider sharing each of these stories on the Internet.
The results of this experiment show that, on the whole, participants are able to correctly assess the accuracy of the information presented to them. However, participants’ responses indicate that they are more likely to share news stories that are consistent with their own political beliefs, even when these stories are inaccurate. In other words, false information that is consistent with participants’ political orientation has a greater potential for dissemination than accurate information that conflicts with their political views.
This result calls into question the validity of the confusion-based account: participants are indeed able to determine the degree of accuracy of the information presented to them, but the information’s political hue is a more decisive criterion in determining their willingness to share this information on the Internet.
Pennycook and his team then propose a hypothesis to explain this phenomenon: although participants care about the accuracy of information, they may be distracted by other motivations (political ones, in this case) when deciding whether to share it. If this hypothesis is correct, then drawing participants’ attention to the accuracy of the information they are about to share could reduce the circulation of false information on the Internet.
To test this hypothesis, the authors encouraged participants to focus on the accuracy of information rather than on its political hue. At the very beginning of a new study, they asked participants to rate the accuracy of a politically neutral piece of information, ostensibly to be used in an upcoming (fictitious) experiment. The study then proceeded as before: participants evaluated the accuracy of a set of news stories and indicated their willingness to share them on the Internet.
With the addition of this simple initial task, participants reported being less likely to share false information consistent with their political orientation. In other words, focusing people’s attention on the accuracy of information suffices to reduce their propensity to share false information, even when it is in line with their political beliefs.
To further verify this experimental result, Pennycook and his collaborators followed a similar procedure with just over 5,000 Twitter users identified as having previously shared false information on that social network. The authors sent each of them a private message on Twitter asking them to rate the accuracy of a politically neutral piece of information, and then monitored the activity of their accounts. In the days following the experiment, these accounts shared significantly less content from unreliable websites.
This is an encouraging result: Pennycook and his team offer a solution to curb the spread of false information online that is relatively simple to implement. However, the effectiveness of this procedure still needs to be tested over the long term and on a larger scale.