How Falsehoods Start to Sound Like Facts

July 30, 2020

You need a microscope to view the stars.

You need a microscope to view the stars.

Believe me yet? Vanderbilt University psychologist Lisa K. Fazio says hearing statements more than once, even “extreme falsehoods that you definitely should know are false,” increases the likelihood people will believe them.

“Even statements Americans as a whole tend to know are false; it didn’t matter,” said Fazio, director of Vanderbilt’s Building Knowledge Lab. “Repetition increased belief for all statements, even for those that were extremely improbable.”

Lisa Fazio, PhD

Fazio is a recipient of a 2020 Federation of Associations in Behavioral & Brain Sciences (FABBS) Early Career Impact Award, and she was nominated by the Psychonomic Society.

In one experiment, Fazio recruited 24 five-year-olds and 24 ten-year-olds from the Nashville area, along with 32 undergraduate adults. Each participant was shown 48 statements: 24 once and 24 twice. Half of the statements were false.

Statements included “A calf is a baby cow,” “birds are cold-blooded animals,” and things most people would not know, such as “all clownfish are born male.”

In that study, 23 percent of undergraduates said false preschool-level statements were true after reading the statements once, and 38 percent said they were true after reading them twice. In the other groups, 35 percent of 10-year-olds and 59 percent of 5-year-olds believed a false statement after hearing it twice.

“Adults show this even for statements that contradicted preschool knowledge,” Fazio said.  

Fazio, who received her PhD from Duke University in 2010, says she was studying falsehoods long before “fake news” became part of our lexicon. “One of the things we see in our society is people repeating a lot of false information,” she remarked, and that repetition is believed to influence voting patterns and other outcomes.

In a study completed earlier this year, funded by a grant from Facebook, Fazio looked at ways to reduce people’s intention to share headlines that are clearly false. For that study, she presented digital headlines to 501 participants recruited through the Amazon Mechanical Turk marketplace. Half of the participants were assigned to a control group and half to an “explain” group, and each participant saw 24 headlines, a mix of true and false.

Participants in both groups were asked to rate how likely they would be to share each headline, with a rating of 6 being “extremely likely.” Those in the explain group were also asked to type out a statement for each headline explaining how they knew it was true or false.

When they had to provide that explanation, she said, “participants were less likely to share a false headline.”

Fazio said she hopes to launch additional studies this year looking at cues, such as prompts, that could be embedded in a platform to help protect people from believing false information.

“We all could [use prompts] to take a pause,” she said, “and think about how we know information is true or false.”