Last month, the National Academy of Sciences, in partnership with the Nobel Foundation, held the second Nobel Prize Summit: Truth, Trust, and Hope. The three-day event featured behavioral scientists as well as a performance artist, a magician, a poet, and Nobel laureates from various scientific disciplines, covering topics from technology and misinformation to anthropology and cognition. We’ve provided a few highlights and encourage you to see the full program here.
Dr. Elizabeth Loftus, Distinguished Professor at the University of California, Irvine, and FABBS IHO recipient, spoke on personal memories and memory paradigms, specifically misinformation. In her research using the “rich false memory” procedure, suggesting a past event that never happened led about 30 percent of participants to develop false memories, while another 23 percent developed false beliefs.
While a suggestion could cause you to falsely remember having been lost in a shopping mall at age 5, Dr. Loftus cautioned that deception isn’t always necessary for implanting false memories; for example, push polls can introduce information that affects perception. “Deep-fake” technology exacerbates this problem: AI-generated photos and speech can provide convincing, though inaccurate or misattributed, images and language.
Dr. Gizem Ceylan, Behavioral Scientist at the Yale School of Management and the Yale Center for Customer Insights, has found that people continue to disseminate information even when they know it is inaccurate. Looking beyond cognitive capacities and biases as the culprit, Dr. Ceylan examined other possibilities, such as the characteristics of social media environments that make individuals more vulnerable to sharing misinformation.
According to Dr. Ceylan, the real problem isn’t people or their cognitive capability, but rather the reward structure on social platforms, which reinforces certain automatic behaviors and habits. Her research showed that less habitual social media users shared less misinformation than habitual users. Notably, 96 percent of people want to share accurate information, supporting the theory that platform-shaped habits, not intentions, are the problem. Dr. Ceylan and her co-authors found that restructuring rewards, for example by positively reinforcing the sharing of accurate information, builds habits that help reduce the spread of fake news.
In her talk, Fissures & Fractures: Tracing the Fault Lines of Misinformation, Dr. Rachel Kuo, Assistant Professor at the University of Illinois Urbana-Champaign, spoke on how technology intersects with politics and social movements, arguing that misinformation is not a new problem but an extension of long histories of inequality.
Dr. Kuo examined ways that misinformation and disinformation have affected each of us, both personally and transnationally, through a connective triad of information, belief, and action. By tracing misinformation to its origins, Dr. Kuo argued, there is tangible hope that intentional intervention can change either the distribution or the consumption of problematic content and, in turn, alter beliefs and actions, potentially yielding larger-scale change.
Information has always been embedded in socio-economic and political structures; misinformation is therefore not a problem of individual responsibility, but of intersecting structural harms. These structures have often reinforced the systemic disenfranchisement of minority groups and promoted racist and classist ideologies, which must continue to be dismantled.
Dr. Nat Kendall-Taylor, CEO of the FrameWorks Institute, examined cultural mindsets and the relationship between scientific trust and misinformation. He unpacked the assumption that science harbors a hidden agenda and stressed the importance of demonstrating science’s social value; in turn, the scientific community must work to communicate effectively and foster trusting exchanges.
It is exciting to see behavioral and social scientists featured in this high-profile international event. These scientists are leading groundbreaking research on integral issues, from diagnosing key social problems (e.g., pandemic responses, voter apathy, and electoral outcomes) to vocalizing the need to analyze power and cultivate shared definitions that eliminate ambiguity when combatting misinformation in the quest for truth.
For more information about the Coalition for Trust in Health and Science, contact the Coalition.