October 9, 2019
On September 24, the National Academies of Sciences, Engineering, and Medicine brought together leaders of federal science agencies, researchers, and representatives from scientific societies and foundations for the Reproducibility and Replicability in Science: Next Steps Symposium.
The report and convening are a response to a request from Congress to assess current reproducibility and replicability across fields of science, offer recommendations for improving rigor and transparency in scientific research, and help build public trust and confidence in the reliability of science. The report defines reproducibility as obtaining consistent computational results using the same input data, computational steps, methods, code, and conditions of analysis. Replicability, by contrast, means obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data. The report also examines circumstances that may interfere with reproducibility and replicability in research.
Brian Nosek, FABBS board member, participated on the panel about computational reproducibility. “While there has been a lot of attention to reproducibility and replication specifically in the field of psychology,” Nosek said, “psychology is also the field most proactively and aggressively working to examine and guarantee the rigor of our work.”
Members of the National Academy of Sciences decadal survey committee and representatives from the federal government and research institutions met on September 30 to discuss findings and recommendations from the consensus study report, A Decadal Survey of the Social and Behavioral Sciences: A Research Agenda for Advancing Intelligence Analysis. Among the discussants was former FABBS president Susan T. Fiske.
Released in March 2019 by the Board on Behavioral, Cognitive, and Sensory Sciences, the report recommends that the intelligence community (IC) commit to collaborating with researchers in the social and behavioral sciences (SBS) as it develops research objectives for the coming decade.
The report identifies key opportunities in SBS research for strengthening intelligence analysis and offers ideas for integrating the knowledge and perspectives of SBS researchers into the planning and design of efforts to support intelligence analysis. Examples of potential contributions of SBS include better understanding of human-machine interaction, stronger intelligence assessments, and insights on human behavior, capacities, and limitations.
Discussant Steven Breckler, a Program Director at NSF, suggested that the IC engage behavioral scientists early in the research process. As an example, Breckler cited hiring algorithms that ended up exhibiting the very same biases as human decision-makers. He suggested that had social psychologists been involved from the beginning, they would have thought to account for this bias. Instead, these scientists were brought in after the fact to correct a problem they might have prevented.
The decadal survey model was developed by NAS to serve other federal agencies by surveying research relevant to key policy objectives. This is the first time that the model has been used to survey SBS fields.