PRINCETON, N.J. — Dislodging someone’s personal beliefs is always difficult, even when those beliefs have no basis in science or reality. However, a new study claims to have uncovered a key to doing away with false beliefs. Researchers at Princeton University say that an individual can be convinced a belief is wrong by being supplied with accurate facts related to the misinformation.
The researchers came to a number of interesting conclusions after conducting an experimental study on the roots of belief and possible ways to disrupt those beliefs. First, they found that listening to a speaker repeat a belief, whether based on fact or not, increases the believability of the statement, particularly if the listener already somewhat believes what is being said. But if the listener hasn’t committed to particular beliefs about the statement’s subject, hearing correct information can help override myths.
Here’s an example given by the study’s authors: a policymaker wants to dispel the myth that “reading in dim light damages children’s eyes.” Instead of repeating the myth itself, they should repeat the related correct statement, “children who spend less time outdoors are at greater risk of developing nearsightedness.” This way, those on the fence in the debate will be more likely to remember the correct information and less likely to remember the misinformation. However, those with strong beliefs on the topic are unlikely to have their opinions changed in either direction.
“In today’s informational environment, where inaccurate information and beliefs are widespread, policymakers would be well served by learning strategies to prevent the entrenchment of these beliefs at a population level,” says study co-author Alin Coman, assistant professor of psychology at Princeton’s Woodrow Wilson School of Public and International Affairs, in a university release.
The study’s sample was not nationally representative, so Coman and his team urge caution, but they believe their findings would hold in a larger, more representative sample. The researchers believe their work can help shape interventions intended to correct misinformation in particularly vulnerable communities and groups.
Coman and his team first conducted a primary study with 58 participants, followed by a replication study with 88 participants. In the main study, participants were given 24 statements: eight myths and 16 correct statements spread across four categories — allergies, nutrition, vision, and health.
After rating their belief in each statement on a scale of one to seven, the participants listened to a recording of a person recalling some of the statements they had read. They were then asked whether the person on the recording was remembering the statements correctly. Next, they were given the four categories again and instructed to recall the original statements from the beginning of the experiment. Finally, they were shown the original statements and asked to rate them on truthfulness and scientific support.
In the end, the results showed that listeners’ beliefs change after hearing information shared by someone else. Specifically, the ease with which a belief comes to mind affects its believability. Statements mentioned in the recordings were remembered more often and believed more strongly by participants. Conversely, statements not mentioned in the recording, even those in the same category as mentioned ones, were much less likely to be remembered or believed. These effects held for both true and false statements.
The study is published in the journal Cognition.