2017 Annual Meeting Showcases Innovations for Transparent and Reproducible Science

By: Kelsey Mulcahy, BITSS Program Manager


Recipients of the 2017 Leamer-Rosenthal Prizes for Open Social Science with Ed Leamer (left) and Ted Miguel (right). Photo credit: Dustin Marshall

The 2017 Annual Meeting of the Berkeley Initiative for Transparency in the Social Sciences (BITSS) moved beyond the credibility crisis in social science research to discuss potential solutions. The two-day event featured a workshop, two panel discussions, an award ceremony for the 2017 Leamer-Rosenthal Prize recipients, and six research sessions. Our largest gathering to date drew over 100 in-person and more than 600 remote participants from across the social sciences, bringing with them a wide variety of perspectives.

We kicked off the event with an engaging keynote discussion on the strength of evidence and thresholds for statistical significance, led by three Leamer-Rosenthal Prize recipients: Dr. Daniël Lakens, Dr. Simine Vazire, and Dr. Eric-Jan Wagenmakers. This debate has garnered widespread interest from the scientific community over the past few months, as high-profile scholars have weighed in on whether the threshold for statistical significance should be lowered from 0.05 to 0.005, or instead be transparently reported and justified (check out the preprint co-authored by Vazire, Wagenmakers, and others, the response by Lakens and co-authors, and a great Vox summary article by Brian Resnick).

While there are strong arguments on both sides, our main takeaway is that we should all be more mindful of what constitutes evidence. Statistical significance alone, whether at the 0.05, 0.005, or even 0.00001 threshold, may not be enough to draw conclusions when results are not triangulated with strong theory, cannot be replicated, or cannot stand up to other means of testing “strength.”
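One way to make that takeaway concrete is to convert a p-value into an upper bound on the evidence it can provide. The Sellke, Bayarri, and Berger (2001) bound, one of the calculations invoked in the “redefine significance” debate, caps the Bayes factor in favor of the alternative at 1/(-e·p·ln p) for p < 1/e. A minimal R sketch (the function name is ours, for illustration only):

    # Upper bound on the Bayes factor in favor of H1 implied by a p-value
    # (Sellke, Bayarri & Berger 2001): BF <= 1 / (-e * p * log(p)),
    # valid for p < 1/e. The function name is ours, not from any package.
    min_bf_bound <- function(p) {
      stopifnot(p > 0, p < exp(-1))
      1 / (-exp(1) * p * log(p))
    }

    min_bf_bound(0.05)   # ~2.5  : modest evidence against the null, at best
    min_bf_bound(0.005)  # ~13.9 : substantially stronger, but still a bound

By this measure, a result just under the 0.05 threshold corresponds to at most roughly 2.5-to-1 evidence against the null, which underscores why significance alone is a thin basis for a conclusion.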

A subsequent panel highlighted a new area for the BITSS community: how to work with high-impact social science institutions to fundamentally change norms and practices at scale. Presentations from the Inter-American Development Bank (IDB), the International Initiative for Impact Evaluation (3ie), and the National Institute of Public Health in Mexico (INSP) all highlighted the need for internal policies to govern how research is managed and published. BITSS can support the development of such policies by addressing common challenges in data management and publication, and facilitating knowledge sharing across institutions. Ultimately, we hope to see a movement toward international standards in how such research should be governed.

This provided a natural segue into the solutions-focused research sessions on Day 2. Highlights include:

  • Internal replication to reduce errors. Jade Benjamin-Chung presented tips and recommended practices for internal replication—a process in which multiple researchers independently complete data cleaning and analysis steps to reduce unintended errors—based on her work on the WASH Benefits trials (a sketch of the comparison step appears after this list).
  • Improving study design. Graeme Blair[i] outlined progress and future plans for DeclareDesign, software that allows researchers to assess the strengths and limitations of a study design before any data are collected, improve it, and easily share it with other researchers (see the DeclareDesign sketch after this list). An R package will be released in early 2018.
  • Dynamic and interactive documents for policy analysis. Fernando Hoces de la Guardia made the case for more transparent and reproducible policy analysis, including the need for transparent assessment and documentation of the inputs and assumptions behind models used in policy reports. Software and tools that social science researchers already use for reproducibility (such as dynamic documents and interactive platforms) could also be leveraged by policy research organizations.
  • Metrics for multisite replications. Maya Mathur proposed new metrics for multisite replication projects that are more intuitive to interpret and leverage the strengths of many-to-one replication designs. An R package called “Replicate” is available.
  • Data sharing agreements for meta-analysis. Josh Polanin discussed how data-sharing agreements could increase researchers’ willingness to share individual participant data in the context of meta-analysis.[ii]
  • Dynamic meta-analysis. Sho Tsuji presented MetaLab, an interface and central repository, powered by open-source tools and crowd-sourced information, built to facilitate dynamic meta-analyses (see the meta-analysis sketch after this list). The platform was originally developed for the early childhood language development literature but could be expanded to other disciplines in the future.[iii]
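To illustrate the internal replication idea from the first bullet: the post does not describe Benjamin-Chung’s exact workflow, but the final step of any such process is a mechanical comparison of independently produced outputs. A minimal R sketch, with hypothetical file names and an assumed id column for sorting:

    # Hypothetical final step of an internal replication: two researchers
    # have independently cleaned the same raw data; compare their outputs.
    cleaned_a <- read.csv("cleaning_researcher_a.csv")
    cleaned_b <- read.csv("cleaning_researcher_b.csv")

    # Put both versions in a canonical order before comparing
    cleaned_a <- cleaned_a[order(cleaned_a$id), ]
    cleaned_b <- cleaned_b[order(cleaned_b$id), ]
    rownames(cleaned_a) <- rownames(cleaned_b) <- NULL

    # all.equal() reports where the two pipelines disagree (within tolerance)
    result <- all.equal(cleaned_a, cleaned_b, tolerance = 1e-8)
    if (isTRUE(result)) {
      message("Internal replication passed: datasets match.")
    } else {
      print(result)  # investigate and reconcile before running the analysis
    }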
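For a flavor of DeclareDesign’s declare-then-diagnose workflow, here is a toy two-arm experiment written against the interface of the released R package. The design itself is our illustration, not one presented at the meeting, and the pre-release API shown in 2017 may have differed:

    # A simple randomized experiment declared as a sequence of steps
    library(DeclareDesign)

    design <-
      declare_model(N = 100, U = rnorm(N),
                    potential_outcomes(Y ~ 0.25 * Z + U)) +
      declare_inquiry(ATE = mean(Y_Z_1 - Y_Z_0)) +
      declare_assignment(Z = complete_ra(N, m = 50)) +
      declare_measurement(Y = reveal_outcomes(Y ~ Z)) +
      declare_estimator(Y ~ Z, inquiry = "ATE")

    # Simulate the design to estimate bias, power, and coverage
    # before any real data are collected
    diagnose_design(design)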
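Finally, the “dynamic” part of a dynamic meta-analysis is conceptually simple: the synthesis is a pure function of a community-maintained table of effect sizes, so it can be re-run whenever a new study is contributed. MetaLab’s own stack is not described here; the sketch below uses metafor, a standard open-source R package, with assumed file and column names:

    # Re-fit the pooled estimate each time the shared effect-size table grows.
    # The CSV and its columns (yi = effect size, vi = sampling variance)
    # are assumptions for illustration.
    library(metafor)

    studies <- read.csv("community_effect_sizes.csv")

    # Random-effects model pooling all currently contributed estimates
    fit <- rma(yi = yi, vi = vi, data = studies, method = "REML")
    summary(fit)

    # Adding a crowd-sourced row to the CSV and re-running this script
    # updates the synthesis -- no manual recomputation needed.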

Since our establishment in 2012, BITSS has made great strides towards empowering the next generation of researchers with the tools they need to make their research more transparent and credible. We have seen an exciting shift in the conversation from a focus on the credibility crisis in social science research to the identification and advancement of innovative solutions. We will continue to work with key players across the research ecosystem—from producers and funders to publishers and users—to generate evidence regarding which tools and methods are most effective in improving practices and norms. We look forward to incorporating best practices into ongoing scale-up efforts.

 

Materials:

Full agenda

OSF Page with all presentations

Video livestreams

[i] Graeme Blair is a 2016 Leamer-Rosenthal Prize recipient in the Emerging Researchers category and an affiliate of the Center for Effective Global Action.

[ii] This project was supported by a Social Science Meta-Analysis and Research Transparency (SSMART) grant.

[iii] This project was supported by a Social Science Meta-Analysis and Research Transparency (SSMART) grant.
