2015 Impact Award Winners

How to Debunk a Scientific Myth

July 21, 2015
by Suzanne Bouffard

Misconceptions about science can be dangerous, like the inaccurate belief that childhood vaccines cause autism. That myth persists even though it has been thoroughly debunked by scientific studies and attacked in national media campaigns. But research by psychologist Panayiota (Pani) Kendeou suggests that carefully crafted messages can change people’s minds and protect public health. Kendeou, an educational psychologist at the University of Minnesota, has brought together research on reading, cognition, and neuroscience in the Knowledge Revision Components Framework (KReC), which explains how people read and incorporate new information designed to correct inaccurate beliefs. She developed the framework because “even though we knew a lot about the impact of misconceptions on memory and learning, we had very little understanding of the mechanisms behind them.” She wanted to know why people held onto inaccurate beliefs and how to change them.

Simply telling people they are wrong doesn’t change their minds. Indeed, one study found that a public awareness campaign telling people that vaccines don’t cause autism backfired, making some parents less likely to vaccinate. Kendeou’s research offers clues about why. Her studies show that people need a causal explanation of why a previously held belief is inaccurate, or of why the competing factual information is true. When people read such an explanation, they are more likely to believe the correct information and to remember it a month later.

One of Kendeou’s studies compared multiple strategies for overcoming ten common false beliefs, including the myth that humans use only 10% of their brains. The most effective was a passage explaining that the inaccurate belief may have started with the famous psychologist William James, who used the statement only metaphorically, and that it is inadvertently reinforced when the media share fMRI pictures of certain brain regions “lighting up” when people engage in certain activities. The passage then went on to explain the correct information, that humans use well over 10% of their brains, and to provide related evidence. Study participants who read both parts of the explanation were more likely to correctly answer a question about human brain use a month later, compared with participants who read only a statement refuting the false belief.

In this and similar studies, the causal explanation appeared to be the most important part, because participants were more likely to get the right answer even when they read the explanation without a refutation of the inaccurate belief. The causal explanation helps to “build a highly interconnected competing network in the brain that wins over the network of the incorrect belief,” Kendeou elaborates. That doesn’t mean the refutation is never necessary. Kendeou hypothesizes that some beliefs require a refutation whereas others don’t. She and her colleagues are now conducting studies that examine whether knowledge revision works differently depending on how committed people are to their inaccurate beliefs and how emotionally invested they are in them. Her hunch is that when people are very attached to an idea, they will be more likely to change their minds without the refutation statement, because the refutation could make them defensive and lead them to immediately reject the correct information.

Those findings could have implications for campaigns encouraging parents to vaccinate their children. Kendeou isn’t surprised that current campaigns aren’t working, because they typically don’t include information about why vaccines don’t cause autism or about what does. (No single cause of autism has been identified, though many risk factors are known.) But she believes it is still unclear whether campaigns should include a refutation of the autism myth.

There are many other applications for Kendeou’s work. She is most interested in those that “influence decision-making and human behavior on issues related to public health and climate change.” She and her collaborators are also beginning to develop technology-based learning environments for middle and high school students to correct misconceptions about science, which, she says, can compound over time and make it difficult for students to master these subjects. So far, all of her research has focused on revising misconceptions through written text, but could the same principles apply to other media, such as television? “I have no reason to believe the KReC framework principles would not generalize,” she says.

Panayiota (Pani) Kendeou was recently honored with the Federation of Associations in Behavioral & Brain Sciences (FABBS) Foundation Early Career Impact Award during the annual meeting of the Society for Text & Discourse in July 2015, in Minneapolis, Minnesota.