Resource Library
The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by applying criteria along these parameters or use the search bar to find what you’re looking for.
Know of a great resource that we haven’t included or have questions about the existing resources? Email us!
Development Research in Practice: The DIME Analytics Data Handbook
Tags: Data Management, Economics, Ethics, Impact Evaluation, Interdisciplinary, International Development, Pre-Analysis Plans, Pre-Registration, Statistical Literacy
Framework for Open and Reproducible Research Training (FORRT)
Tags: Data Management, Dynamic Documents and Coding Practices, Interdisciplinary, Issues with transparency and reproducibility, Pre-Analysis Plans, Statistical Literacy, Transparent Reporting
FORRT is a pedagogical infrastructure designed to recognize and support the teaching and mentoring of open and reproducible science tenets in tandem with prototypical subject matters in higher education. FORRT also advocates for opening teaching and mentoring materials as a means to facilitate access, discovery, and learning for those who would otherwise be educationally disenfranchised.
Analysis Plans in Economics
Tags: Economics, Pre-Analysis Plans
Find slides from a presentation by Benjamin Olken titled “Analysis Plans in Economics”.
Pre-Analysis Plans – Applications in Economics
Tags: Economics, Pre-Analysis Plans
Find slides from a presentation by Katherine Casey titled “Pre-analysis Plans (PAPs): Applications in Economics”.
Perspectives from Biomedical Research
Tags: Health Sciences, Medicine, Pre-Analysis Plans, Pre-Registration, Transparent Reporting
Find slides from a presentation by Maya Petersen titled “Pre-Registration, Pre-analysis, and Transparent Reporting: Perspectives from biomedical research”.
Reproducible and Collaborative Statistical Data Science
Tags: Pre-Analysis Plans
Find slides from a presentation by Philip Stark titled “Reproducible and Collaborative Statistical Data Science”.
How to write a Pre-Analysis Plan
Tags: Pre-Analysis Plans
Find slides from a presentation by Dalson Figueiredo and Lucas Silva titled “How to write a Pre-Analysis Plan”.
Pre-Analysis Plans (French)
Tags: Pre-Analysis Plans
Find slides from a presentation by Zachary Tsala Dimbuene titled “Pre-Analysis Plans (French)”.
Pre-Analysis Plans in Behavioral and Experimental Economics
Tags: Economics, Pre-Analysis Plans
Find slides from a presentation by Johannes Haushofer titled “Pre-Analysis Plans in Behavioral and Experimental Economics”.
Pre-Analysis Plans for Observational Research
Tags: Economics, Pre-Analysis Plans
In her presentation at RT2 DC in 2019, Fiona Burlig (University of Chicago) provides advice on how one can credibly pre-register an observational research project. Also see Burlig’s 2018 paper that describes three scenarios for pre-registration of observational work, including i) cases where researchers collect their own data; ii) prospective studies; and iii) research using restricted-access data.
Open Science Module for Behavioral Science graduate course
Tags: Economics, Psychology
Instructors Kelly Zhang (MIT GOV/LAB) and Chaning Jang (Busara) integrated a module on research transparency and the use of pre-analysis plans into the Behavioral Science in the Field course, designed for graduate students who use behavioral science games in their research.
Videos: Research Transparency and Reproducibility Training (RT2) – Washington, D.C.
Tags: Data Management, Interdisciplinary, Issues with transparency and reproducibility, Meta-Analyses, Power analysis, Pre-Analysis Plans, Preprints, Registries, Replications, Results-Blind Review & Registered Reports, Statistical Literacy, Transparent Reporting, Version Control
BITSS hosted a Research Transparency and Reproducibility Training (RT2) in Washington DC, September 11-13, 2019. This was the eighth training event of this kind organized by BITSS since 2014.
RT2 provides participants with an overview of tools and best practices for transparent and reproducible social science research. Click here to view videos of presentations given during the training. Find slide decks and other useful materials on this OSF project page (https://osf.io/3mxrw/).
Preregistration of secondary data analysis: A template and tutorial
Tags: Interdisciplinary, Registries
Van den Akker and colleagues present a template specifically designed for the preregistration of secondary data analyses and provide comments and a practical example.
BITSS Registered Reports Literature Review
Tags: Economics, Pre-Analysis Plans, Results-Blind Review & Registered Reports
Prepared by BITSS, this literature review includes information on the distinguishing features and advantages of registered reports, as well as challenges involved in their implementation.
Frontiers in Pre-Registration in Economics – Ted Miguel
Tags: Economics, Pre-Analysis Plans, Registries, Results-Blind Review & Registered Reports
This presentation by Ted Miguel was given at the Transparency, Reproducibility and Credibility Research Symposium at the World Bank on 9/10/2019. You can find videos of other talks from the Symposium in this playlist.
Observational PAP Guide
Tags: Economics and Finance, Pre-Analysis Plans
In her preprint titled “Improving transparency in observational social science research: A pre-analysis plan approach”, Fiona Burlig (University of Chicago) presents three scenarios in which study preregistration and pre-analysis plans (PAPs) can be credibly applied in non-experimental settings: cases where researchers collect their own data; prospective studies; and research using restricted-access data. The preprint also includes suggested contents for observational PAPs, and highlights where observational PAPs should deviate from those designed for experimental research.
This work was also published in the journal Economics Letters.
Stage 1 Registered Report Submission Template
Tags: Economics and Finance, Political Science, Pre-Analysis Plans, Results-Blind Review & Registered Reports, Transparent Reporting
BITSS prepared a template to assist authors in the preparation of their Stage 1 Proposal submissions to the Journal of Development Economics. The template expands on features that are commonly reported in pre-analysis plans in development economics, and includes a checklist to help authors record different parts of the research design.
Course materials: PhD Toolkit on Transparent, Open, and Reproducible Research
Tags: Economics and Finance, Meta-Analyses, Pre-Analysis Plans, Public Policy, Registries, Replications
Catalyst Ada Gonzalez-Torres developed and delivered a PhD course on Transparent, Open, and Reproducible Research for PhD students at the European University Institute (EUI) in Florence, Italy. Find all course materials here.
PhD Course Materials: Transparent, Open, and Reproducible Policy Research
Tags: Data Management, Dynamic Documents and Coding Practices, Health Sciences, Interdisciplinary, Issues with transparency and reproducibility, Meta-Analyses, Open Publishing, Pre-Analysis Plans, Preprints, Public Policy, Registries, Replications, Statistical Literacy, Transparent Reporting, Version Control
BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.
Transparency Training Module for Undergraduate Experimental Economics
Tags: Dynamic Documents and Coding Practices, Issues with transparency and reproducibility, Meta-Analyses, Pre-Analysis Plans, Replications, Statistical Literacy
These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.
These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.
Registered Reports at the Journal of Development Economics
Tags: Economics and Finance, Results-Blind Review & Registered Reports
As part of a pilot project, the Journal of Development Economics (JDE) now offers authors the opportunity to submit empirical research designs for review and approval before the results of the study are known. The pre-results review track is designed to reward well-designed and well-executed studies regardless of whether their empirical results yield clear interpretations.
Learn more about the pilot in this blog post by JDE Editors Andrew Foster and Dean Karlan, and BITSS Faculty Director Edward Miguel.
COS Registered Reports information portal
Tags: Interdisciplinary, Results-Blind Review & Registered Reports
The Center for Open Science (COS) has put together a portal containing information about the registered reports format of peer review and publication. The portal includes general information about registered reports, a list of journals that have implemented the format, an explanation of an appropriate workflow, resources for journal editors, motivation for funders, FAQs, and a list of allied initiatives, including those that focus on results-blind review and Exploratory Reports.
Mapping the Universe of Registered Reports
Tags: Interdisciplinary, Results-Blind Review & Registered Reports
A preprint by Tom Hardwicke and John Ioannidis. Abstract: Selection pressures for significant results may infuse bias into the research process. We evaluated the implementation of one innovation designed to mitigate this bias, ‘Registered Reports’, where study protocols are peer-reviewed and granted in-principle acceptance (IPA) for publication before the study has been conducted. As of February 2018, 91 journals had adopted Registered Reports and 91 Final Reports had been published. Psychology journals are the principal adopters, but expansion has begun into medicine, social science, and other fields. Among 29 journals that responded to a survey, 334 protocols had been submitted to them, 87 had been granted IPA and 32 Final Reports had been published or were in press as of July 2017. We encountered several sub-optimal implementation practices, including non-availability of IPA protocols, and diverse approaches to protocol registration in the absence of a single central registry. Registered Reports should be iteratively evaluated and improved to ensure maximal benefits.
Course Syllabi for Open and Reproducible Methods
Tags: Anthropology, Archaeology, and Ethnography; Data Repositories; Data Visualization; Dynamic Documents and Coding Practices; Economics and Finance; Engineering and Computer Science; Health Sciences; Humanities; Interdisciplinary; Issues with transparency and reproducibility; Life Sciences; Linguistics; Meta-Analyses; Metascience (Methods and Archival Science); Open Publishing; Other Social Sciences; Political Science; Power analysis; Pre-Analysis Plans; Psychology; Public Policy; Registries; Replications; Sociology; Statistical Literacy; Statistics and Data Science; Transparent Reporting; Version Control
A collection of course syllabi from any discipline featuring content to examine or improve open and reproducible research practices. Housed on the OSF.
AEA Registry for RCTs
Tags: Economics and Finance, Political Science, Pre-Analysis Plans, Psychology, Public Policy, Registries
The American Economic Association (AEA) Randomized Controlled Trials Registry is a registry for RCTs conducted in the social sciences. Registration is free and you do not need to be a member of the AEA to register. We encourage you to register any new study before data collection.
Improving the Credibility of Social Science Research: A Practical Guide for Researchers
Tags: Data Management, Economics and Finance, Interdisciplinary, Issues with transparency and reproducibility, Political Science, Pre-Analysis Plans, Psychology, Public Policy, Registries, Replications, Sociology
Royal Society Open Science Registered Reports
Tags: Health Sciences, Other Social Sciences, Pre-Analysis Plans, Psychology, Replications, Results-Blind Review & Registered Reports
Royal Society Open Science is a fast, open-access journal publishing high-quality research across all of science, engineering, and mathematics. A Registered Report (RR) is a form of journal article in which methods and proposed analyses are pre-registered and peer-reviewed before the research is conducted (Stage 1). High-quality protocols are then provisionally accepted for publication before data collection begins. The format is open to replication attempts as well as novel studies. Once the study is completed, the author finishes the article, including the results and discussion sections (Stage 2); reviewers then appraise the full article and, provided the necessary conditions are met, it is published.
J-PAL Hypothesis Registry
Tags: Pre-Analysis Plans
The Abdul Latif Jameel Poverty Action Lab (J-PAL) hypothesis registry accepted submissions from 2009 to 2013, and was then replaced by the AEA’s registry, socialscienceregistry.org. The hypothesis registry contains 13 examples of pre-analysis plans, primarily from economists doing randomized controlled trials in developing country settings, but also from a large-scale policy natural experiment using the Medicaid program in Oregon. Additional pre-analysis plans from the Oregon experiment are available here.
Nicebread
Tags: Data Management, Data Visualization, Dynamic Documents and Coding Practices, Interdisciplinary, Issues with transparency and reproducibility, Meta-Analyses, Open Publishing, Power analysis, Pre-Analysis Plans, Preprints, Psychology, Registries, Replications, Results-Blind Review & Registered Reports, Transparent Reporting, Version Control
Dr. Felix Schönbrodt’s blog promoting research transparency and open science.
DeclareDesign
Tags: Dynamic Documents and Coding Practices, Interdisciplinary, Political Science, Power analysis, Pre-Analysis Plans, Statistics and Data Science
DeclareDesign is statistical software that helps researchers characterize and diagnose research designs, including experiments, quasi-experiments, and observational studies. DeclareDesign consists of a core package, as well as three companion packages that stand on their own but can also complement the core package: randomizr (easy-to-use tools for common forms of random assignment and sampling), fabricatr (tools for fabricating data to enable frontloading analysis decisions in social science research), and estimatr (fast estimators for social science research).
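DeclareDesign itself is an R package, so the snippet below is only a conceptual Python sketch of the declare-then-diagnose workflow it supports: write the design down as a simulable object, then estimate diagnosands such as power and bias by Monte Carlo. All names and parameter values here are illustrative assumptions, not the package’s API.

```python
# Conceptual sketch of a "declare, then diagnose" workflow (DeclareDesign
# is an R package; this is NOT its API, just the idea in Python).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def declare_design(n=100, effect=0.3, sd=1.0):
    """Hypothetical two-arm experiment, written down as a simulable object."""
    def run_once():
        treat = rng.permutation(np.repeat([0, 1], n // 2))  # complete random assignment
        y = effect * treat + rng.normal(0, sd, n)           # assumed outcome model
        estimate = y[treat == 1].mean() - y[treat == 0].mean()
        pvalue = stats.ttest_ind(y[treat == 1], y[treat == 0]).pvalue
        return estimate, pvalue
    return run_once

def diagnose(design, true_effect=0.3, sims=2000):
    """Monte Carlo diagnosands: statistical power and estimator bias."""
    draws = np.array([design() for _ in range(sims)])
    estimates, pvalues = draws[:, 0], draws[:, 1]
    return {"power": float((pvalues < 0.05).mean()),
            "bias": float(estimates.mean() - true_effect)}

print(diagnose(declare_design()))  # power is roughly 0.32 under these assumptions
```

A diagnosis like this one, run before data collection, is what lets a researcher learn that a design is underpowered while there is still time to change it.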
NeuroChambers
Tags: Issues with transparency and reproducibility, Open Publishing, Power analysis, Pre-Analysis Plans, Psychology, Replications, Results-Blind Review & Registered Reports, Transparent Reporting
Chris Chambers is a psychologist and neuroscientist at the School of Psychology, Cardiff University. He created this blog after taking part in a debate about science journalism at the Royal Institution in March 2012. The aim of his blog is to give readers some insights from the trenches of science. He talks about a range of science-related issues and may even give up a trade secret or two.
The New Statistics (+OSF Learning Page)
Tags: Data Management, Dynamic Documents and Coding Practices, Interdisciplinary, Meta-Analyses, Open Publishing, Power analysis, Pre-Analysis Plans, Psychology, Replications, Statistical Literacy, Statistics and Data Science, Transparent Reporting, Version Control
This OSF project helps organize resources for teaching the “New Statistics” — an approach that emphasizes asking quantitative questions, focusing on effect sizes, using confidence intervals to express uncertainty about effect sizes, using modern data visualizations, seeking replication, and using meta-analysis as a matter of course.
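For a flavor of the estimation-centered reporting this resource teaches, here is a minimal Python sketch, using fabricated data, that reports a mean difference with a 95% confidence interval and a standardized effect size rather than a bare p-value.

```python
# Estimation-focused reporting in the spirit of the "New Statistics":
# an effect size with a 95% CI instead of a bare p-value.
# The data below are fabricated for illustration.
import numpy as np
from scipy import stats

a = np.array([5.1, 6.3, 4.8, 7.0, 5.9, 6.4, 5.5, 6.1])  # hypothetical treatment scores
b = np.array([4.2, 5.0, 4.6, 5.8, 4.9, 5.1, 4.4, 5.3])  # hypothetical control scores

diff = a.mean() - b.mean()                                # point estimate of the effect
se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
df = len(a) + len(b) - 2                                  # simple pooled-df approximation
lo, hi = diff + np.array([-1, 1]) * stats.t.ppf(0.975, df) * se

d = diff / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)   # Cohen's d (pooled SD)
print(f"mean difference = {diff:.2f}, 95% CI [{lo:.2f}, {hi:.2f}], d = {d:.2f}")
```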
pcpanel
Tags: Economics and Finance, Power analysis, Pre-Analysis Plans, Statistics and Data Science
This package performs power calculations for randomized experiments that use panel data. Unlike the existing programs “sampsi” and “power”, this package accommodates arbitrary serial correlation. The program “pc_simulate” performs simulation-based power calculations using a pre-existing dataset (stored in memory), and accommodates cross-sectional, multi-wave panel, difference-in-differences, and ANCOVA designs. The program “pc_dd_analytic” performs analytical power calculations for a difference-in-differences experimental design, applying the formula derived in Burlig, Preonas, and Woerman (2017) that is robust to serial correlation. Users may either input parameters to characterize the assumed variance-covariance structure of the outcome variable, or allow the subprogram “pc_dd_covar” to estimate the variance-covariance structure from pre-existing data.
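pcpanel is a Stata package, so the following Python sketch only illustrates the simulation-based approach behind a program like “pc_simulate”: generate panel data with serially correlated errors (here AR(1)) under an assumed treatment effect, analyze each draw, and take the rejection rate as power. All parameter values are assumptions chosen for the example.

```python
# Simulation-based power calculation for a panel RCT with serially
# correlated errors. Conceptual illustration only, not the Stata package.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulated_power(n_units=50, n_waves=6, effect=0.25, rho=0.5, sims=1000):
    """Rejection rate for a panel RCT whose errors are AR(1) within unit."""
    treat = np.zeros(n_units)
    treat[: n_units // 2] = 1                      # half the units treated
    post = np.arange(n_waves) >= n_waves // 2      # treatment turns on mid-panel
    rejections = 0
    for _ in range(sims):
        # Draw AR(1) errors wave by wave, keeping the marginal variance at 1.
        e = np.empty((n_units, n_waves))
        e[:, 0] = rng.normal(size=n_units)
        for t in range(1, n_waves):
            e[:, t] = rho * e[:, t - 1] + rng.normal(
                scale=np.sqrt(1 - rho**2), size=n_units)
        y = effect * treat[:, None] * post[None, :] + e
        # Collapse to one post-minus-pre difference per unit, so the test
        # remains valid under arbitrary within-unit serial correlation.
        d = y[:, post].mean(axis=1) - y[:, ~post].mean(axis=1)
        if stats.ttest_ind(d[treat == 1], d[treat == 0]).pvalue < 0.05:
            rejections += 1
    return rejections / sims

print(f"simulated power: {simulated_power():.2f}")
```

Varying rho in a sketch like this shows why ignoring serial correlation, as “sampsi”-style calculations do, can badly misstate the sample size a panel experiment needs.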
Transparent and Open Social Science Research
Tags: Dynamic Documents and Coding Practices, Issues with transparency and reproducibility, Meta-Analyses, Pre-Analysis Plans, Registries, Replications, Statistical Literacy, Transparent Reporting
Demand is growing for evidence-based policymaking, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.
You can access the course videos for self-paced learning on the BITSS YouTube channel here (also available with subtitles in French here). You can also enroll for free during curated course runs on the FutureLearn platform.
Manual of Best Practices
Tags: Dynamic Documents and Coding Practices, Issues with transparency and reproducibility, Pre-Analysis Plans, Transparent Reporting
Manual of Best Practices, written by Garret Christensen (BITSS), is a working guide to the latest best practices for transparent quantitative social science research. The manual is also available, and occasionally updated, on GitHub. For suggestions or feedback, contact garret@berkeley.edu.
EGAP Registry
Tags: Economics and Finance, Political Science, Pre-Analysis Plans, Public Policy, Registries, Sociology
The Evidence in Governance and Politics (EGAP) Registry focuses on designs for experiments and observational studies in governance and politics. The registry allows users to submit an array of information via an online form. Registered studies can be viewed in the form of a pdf on the EGAP site. The EGAP registry is straightforward and emphasizes simplicity for registering impact evaluations.
ClinicalTrials.gov
Tags: Health Sciences, Pre-Analysis Plans, Registries
ClinicalTrials.gov is a registry and database that provides information on publicly and privately funded clinical trials, maintained by the National Library of Medicine at the National Institutes of Health. Studies are often submitted to the site when they begin and are regularly updated along the way. ClinicalTrials.gov is the largest trial registry, with over 250,000 studies from across the world.
Promise and Perils of Pre-Analysis Plans
Tags: Economics and Finance, Issues with transparency and reproducibility, Pre-Analysis Plans, Registries
Promise and Perils of Pre-analysis Plans, by Ben Olken, lays out many of the items to include in a pre-analysis plan, along with the history of PAPs, their benefits, and a few potential drawbacks. Pre-analysis plans can be especially useful for reaching agreement about what will be measured, and how, when a partner or funder has a vested interest in the outcome of a study.
Reshaping Institutions
Tags: Economics and Finance, Issues with transparency and reproducibility, Political Science, Pre-Analysis Plans, Statistical Literacy
Reshaping Institutions is a paper by Katherine Casey, Rachel Glennerster, and Edward Miguel that uses a pre-analysis plan to analyze the effects of a community-driven development program in Sierra Leone. They discuss the contents and benefits of a PAP in detail, and include a “cherry-picking” table that shows the wide flexibility of analysis that is possible without pre-specification. The PAP itself is included in Appendix A of the supplementary materials, available at the link above.
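The logic behind the paper’s cherry-picking table can be seen in a small simulation: when many outcomes are tested without pre-specification and there is no true effect, at least one nominally significant result still appears most of the time. A minimal Python sketch with assumed numbers (100 subjects, 20 outcomes), not a reproduction of the paper’s table:

```python
# Why pre-specification matters: with many candidate outcomes and no
# true effect, "significant" results appear by chance alone.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, n_outcomes, sims = 100, 20, 2000

false_hits = 0
for _ in range(sims):
    treat = rng.permutation(np.repeat([0, 1], n // 2))
    # 20 independent outcomes, none actually affected by treatment
    y = rng.normal(size=(n, n_outcomes))
    pvals = [stats.ttest_ind(y[treat == 1, j], y[treat == 0, j]).pvalue
             for j in range(n_outcomes)]
    false_hits += min(pvals) < 0.05

print(f"P(at least one p < 0.05): {false_hits / sims:.2f}")  # ~0.64 = 1 - 0.95**20
```

Committing in advance to specific outcomes and specifications, as a PAP does, is what removes this room for cherry-picking.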
Pre-Analysis Plan Template
Tags: Economics and Finance, Political Science, Pre-Analysis Plans, Transparent Reporting
Pre-analysis Plan Template, by Alejandro Ganimian, is useful for instructors when teaching transparency methods, and for researchers themselves when developing their own pre-analysis plan.
Find a .doc version of this template here. Find a .tex version here.
Pre-Analysis Plan Checklist
Tags: Economics and Finance, Pre-Analysis Plans, Registries, Transparent Reporting
Pre-analysis Plan Checklist, by David McKenzie, Lead Economist at the World Bank Development Research Group.
Experimental Lab Standard Operating Procedures
Tags: Data Management, Meta-Analyses, Political Science, Pre-Analysis Plans, Replications, Transparent Reporting
This standard operating procedure (SOP) document describes the default practices of the experimental research group led by Donald P. Green at Columbia University. These defaults apply to analytic decisions that have not been made explicit in pre-analysis plans (PAPs); they are not meant to override decisions laid out in PAPs. The contents of the lab’s SOP are available for public use, and others are welcome to copy or adapt it to suit their research purposes.