Resource Library
The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by applying criteria along these parameters or use the search bar to find what you’re looking for.
Know of a great resource that we haven’t included or have questions about the existing resources? Email us!
Pre-Analysis Plans in Behavioral and Experimental Economics [Tags: Economics; Pre-Analysis Plans]
Open Science Success Stories [Tags: Data Management and De-identification; Issues with transparency and reproducibility]

The Open Research Funders Group curates the Open Science Success Stories, a database of examples of how openness has benefited researchers and broader society.
Data Citations module [Tags: Data Management and De-identification; Interdisciplinary; Transparent Reporting]

Created by the Federal Reserve Bank of St. Louis, this module introduces students to the key elements of data citations. See also related modules for Data Literacy.
Handbook on Using Administrative Data for Research and Evidence-Based Policy [Tags: Data Management and De-identification; Economics; Interdisciplinary; International Development; Reproducibility]

Co-edited by Shawn Cole, Iqbal Dhaliwal, Anja Sautmann, and Lars Vilhuber and published by J-PAL’s Innovations in Data and Experiments for Action Initiative (IDEA), this handbook includes case studies of large-scale randomized evaluations using private and national government administrative data, and technical guidance to support partnerships with governments, nonprofits, or firms to access data and pursue cutting-edge, policy-relevant projects.
Survey of Registered Reports Editors [Tags: Interdisciplinary; Results-Blind Review & Registered Reports]
Between December 15, 2017 and January 31, 2018, BITSS surveyed the editors of 76 academic journals that, at the time, accepted submissions in the Registered Report (RR) format. Find summary statistics of the results in this document.
CRediT (Contributor Roles Taxonomy) [Tags: Interdisciplinary; Transparent Reporting]

CRediT (Contributor Roles Taxonomy) is a high-level taxonomy of 14 roles that can be used to represent the roles typically played by contributors to scholarly output, describing each contributor’s specific contribution to the work.
Comparison of multiple hypothesis testing commands in Stata [Tags: Economics; Statistics and Data Science]
In this post on the Development Impact blog, David McKenzie (World Bank) compares various Stata packages used for multiple hypothesis testing adjustments and discusses settings where each package is best applied.
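The adjustments these Stata packages implement are language-agnostic. As a minimal illustration (not taken from McKenzie's post, and in Python rather than Stata), the Holm step-down correction can be sketched as follows; the example p-values are invented:

```python
def holm_adjust(pvals):
    """Holm step-down adjustment for multiple hypothesis testing.

    Sort p-values ascending, multiply the i-th smallest (0-indexed)
    by (m - i), enforce monotonicity, and cap at 1.
    Controls the family-wise error rate.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        adj = min(1.0, (m - rank) * pvals[idx])
        running_max = max(running_max, adj)  # keep adjusted values non-decreasing
        adjusted[idx] = running_max
    return adjusted

# Hypothetical p-values from a multi-outcome experiment;
# the smallest raw p-value is scaled by m, the next by m-1, and so on.
print(holm_adjust([0.01, 0.04, 0.03, 0.005]))
```

Established implementations (e.g. `statsmodels.stats.multitest.multipletests` in Python, or the Stata packages compared in the post) should be preferred in practice; this sketch only shows the mechanics.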
Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Educational Expansion [Tags: Epidemiology; Statistical Literacy; Transparent Reporting]

Created by Catalyst Melissa Sharp, this is an open-source repository for epidemiological research methods and reporting skills for observational studies, structured based on the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement. Use it to discover new methods and reporting guidelines and contribute through the GitHub repository (https://github.com/sharpmel/STROBECourse/).
Pre-Analysis Plans for Observational Research [Tags: Economics; Pre-Analysis Plans]
In her presentation at RT2 DC in 2019, Fiona Burlig (University of Chicago) provides advice on how one can credibly pre-register an observational research project. Also see Burlig’s 2018 paper that describes three scenarios for pre-registration of observational work, including i) cases where researchers collect their own data; ii) prospective studies; and iii) research using restricted-access data.
Data for Development Impact (Resource Guide) [Tags: Data Management and De-identification; Economics; Other Social Sciences; Statistics and Data Science]

“Data for Development Impact: The DIME Analytics Resource Guide” is intended to serve as an introduction to the primary tasks required in development research, from experimental design to data collection to data analysis to publication. It serves as a companion to the DIME Wiki and is produced by DIME Analytics.
Open Science Module for Behavioral Science graduate course [Tags: Economics; Psychology]
Instructors Kelly Zhang (MIT GOV/LAB) and Chaning Jang (Busara) integrated a module on research transparency and the use of pre-analysis plans as part of the Behavioral Science in the Field course designed for graduate students who use behavioral science games as part of their research.
J-PAL Guide to De-Identifying Data [Tags: Data Management and De-identification; International Development]
Developed by J-PAL’s Sarah Kooper, Anja Sautmann, and James Turrito, this guide includes:
- An overview of personally identifiable information (PII) and the responsibility of data users not to use data to try to identify human subjects
- Recommendations for handling direct identifiers (such as full name, social security number, or phone number), as well as indirect identifiers (such as month/year of birth, nationality, or gender)
- Guidance on de-identification steps to take throughout the research process, such as encrypting all data containing identifying information as soon as possible
- A list of common identifiers, including those labeled by the United States’ Health Insurance Portability and Accountability Act (HIPAA) guidelines as direct identifiers
- And more.
See also the accompanying Guide to Publishing Research Data.
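As a loose illustration of the guide's recommendations on direct and indirect identifiers (this code is not from the guide; the field names and records are hypothetical), a de-identification step might drop direct identifiers and coarsen an indirect one:

```python
# Hypothetical sketch of a de-identification step: drop direct
# identifiers and coarsen date of birth to year of birth.
# Field names and values are invented for illustration only.
DIRECT_IDENTIFIERS = {"full_name", "ssn", "phone"}

def deidentify(record):
    """Return a copy of `record` with direct identifiers removed
    and exact date of birth coarsened to year of birth."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "date_of_birth" in clean:                  # e.g. "1987-05-21"
        clean["birth_year"] = clean.pop("date_of_birth")[:4]
    return clean

respondent = {"full_name": "A. Person", "phone": "555-0100",
              "date_of_birth": "1987-05-21", "village": "X"}
print(deidentify(respondent))
```

Real de-identification requires case-by-case judgment about re-identification risk from combinations of indirect identifiers, which is exactly what the guide covers; a field-dropping script like this is only the mechanical first step.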
J-PAL Guide to Publishing Research Data [Tags: Data Management and De-identification; International Development; Public Policy]

Developed by J-PAL’s Sarah Kooper, Anja Sautmann, and James Turrito, this guide includes:
- A list of considerations to make before publishing data, such as what information was provided to study participants and the IRB, the sensitivity of the data collected, and legal requirements
- Sample consent form language that will allow future publication of de-identified data
- A checklist for preparing data for publication
- And more.
See also the accompanying Guide to De-identifying Data.
Data Sharing Checklist for NGOs and Practitioners [Tags: Data Management and De-identification; Interdisciplinary]

This checklist, developed by Teamscope, can help NGOs and practitioners understand common pitfalls in open data and how open data affects every step of a project’s pipeline, from proposal writing to dissemination.
Videos: Research Transparency and Reproducibility Training (RT2) – Washington, D.C. [Tags: Data Management and De-identification; Interdisciplinary; Issues with transparency and reproducibility; Meta-Analyses; Power analysis; Pre-Analysis Plans; Preprints; Registries; Replications; Results-Blind Review & Registered Reports; Statistical Literacy; Transparent Reporting; Version Control]

BITSS hosted a Research Transparency and Reproducibility Training (RT2) in Washington DC, September 11-13, 2019. This was the eighth training event of this kind organized by BITSS since 2014.
RT2 provides participants with an overview of tools and best practices for transparent and reproducible social science research. Click here to view videos of presentations given during the training. Find slide decks and other useful materials on this OSF project page (https://osf.io/3mxrw/).
Preregistration of secondary data analysis: A template and tutorial [Tags: Interdisciplinary; Registries]
Van den Akker and colleagues present a template specifically designed for the preregistration of secondary data analyses and provide comments and a practical example.
Replicability Seminar [Tags: Issues with transparency and reproducibility; Statistical Literacy]
Course syllabus for “Replicability Seminar”, an advanced undergraduate and graduate-level course led by Simine Vazire.
Open Data Metrics: Lighting the Fire [Tags: Data Management and De-identification; Interdisciplinary]

In this book, Daniella Lowenberg and colleagues describe the journey towards open data metrics, prompting community discussion and providing implementation examples along the way. Open data metrics are a precondition for realizing the benefits of open data sharing practices.
BITSS Registered Reports Literature Review [Tags: Economics; Pre-Analysis Plans; Results-Blind Review & Registered Reports]
Prepared by BITSS, this literature review includes information on the distinguishing features and advantages of registered reports, as well as challenges involved in their implementation.
Nextjournal [Tags: Dynamic Documents and Coding Practices; Version Control]

Nextjournal is a container-based tool with features like polyglot notebooks, automatic versioning, and real-time collaboration.
Frontiers in Pre-Registration in Economics – Ted Miguel [Tags: Economics; Pre-Analysis Plans; Registries; Results-Blind Review & Registered Reports]
This presentation by Ted Miguel was given at the Transparency, Reproducibility and Credibility Research Symposium at the World Bank on 9/10/2019. You can find videos of other talks from the Symposium in this playlist.
BITSS training survey templates [Tags: Interdisciplinary; Issues with transparency and reproducibility]
BITSS developed templates for pre- and post-training surveys that can be used by instructors to record learning outcomes in research transparency and reproducibility training events.
The links below enable access as an editor; please make a copy of each form to use it for your own purposes:
Software Carpentry [Tags: Data Management and De-identification; Dynamic Documents and Coding Practices; Engineering and Computer Science; Interdisciplinary; Statistics and Data Science; Version Control]
Software Carpentry offers online tutorials for data analysis, including Version Control with Git, Using Databases and SQL, Programming with Python, Programming with R, and Programming with MATLAB.
Transparent and Open Social Science Research (FR) [Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility]
Demand is growing for evidence-based policy making, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created and administered by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.
Data Carpentry Lessons [Tags: Data Management and De-identification; Interdisciplinary]
Developed by Data Carpentry, these lessons can be used across the social sciences to teach data cleaning, management, analysis, and visualization. R is the base language for instruction, and there are no prerequisites in terms of prior knowledge of the topic.
Observational PAP Guide [Tags: Economics and Finance; Pre-Analysis Plans]
In her preprint titled “Improving transparency in observational social science research: A pre-analysis plan approach”, Fiona Burlig (University of Chicago) presents three scenarios in which study preregistration and pre-analysis plans (PAPs) can be credibly applied in non-experimental settings: cases where researchers collect their own data; prospective studies; and research using restricted-access data. The preprint also includes suggested contents for observational PAPs, and highlights where observational PAPs should deviate from those designed for experimental research.
This work was also published in the journal Economics Letters.
Web Plot Digitizer [Tags: Data Management and De-identification; Interdisciplinary; Metascience (Methods and Archival Science); Statistics and Data Science]
WebPlotDigitizer is a web-based tool that extracts the underlying numerical data from images of charts and plots.
ResponsibleData.io [Tags: Data Management and De-identification; Dynamic Documents and Coding Practices; Interdisciplinary; Metascience (Methods and Archival Science); Statistics and Data Science]
Using data for social change work offers many opportunities, but it brings challenges, too. The Responsible Data (RD) community develops practical ways to deal with the unintended consequences of using data in social change work, establishes best practices, and shares approaches between leading thinkers and doers from different sectors. The community discusses thorny topics in person, facilitates online group discussions on the RD mailing list, and shares resources on its site.
Seven Reasons Why: A User’s Guide to Transparency and Reproducibility [Tags: Political Science]
BITSS Catalyst Dalson Figueiredo Filho and colleagues present seven practical insights and recommendations in favor of research transparency and reproducibility in what is one of the first discussions of open science in Brazilian political science.
Registry for International Development Impact Evaluations (RIDIE) [Tags: Economics and Finance; Interdisciplinary; Political Science; Registries]

Administered by the International Initiative for Impact Evaluation (3ie), the Registry for International Development Impact Evaluations (RIDIE) is a registry of impact evaluations related to development in low- and middle-income countries. RIDIE will register any development impact evaluation that rigorously attempts to estimate the causal impacts of a program, including but not limited to randomized controlled trials. It is intended to be a prospective registry in which researchers and evaluators can record information about their evaluation designs before conducting the analysis, as well as update information as the study proceeds and post findings upon study completion.
Catalog of open source licenses [Tags: Interdisciplinary; Open Publishing]

Using this online tool, you can choose an open source license to clearly articulate the conditions under which others can use, distribute, modify or contribute to your software and non-software projects.
ARDC FAIR Data self-assessment tool [Tags: Data Management and De-identification; Interdisciplinary]

This checklist, developed by the Australian Research Data Commons (ARDC), may help researchers make their datasets FAIRer: findable, accessible, interoperable, and reusable.
Conda [Tags: Data Visualization; Interdisciplinary; Statistics and Data Science]

Conda is an open-source package and environment management system that runs on Windows, macOS, and Linux. Conda installs, runs, and updates packages and their dependencies, and supports multiple languages, including Python, R, Ruby, Lua, Scala, Java, JavaScript, C/C++, and Fortran.
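For reproducible research, conda environments are typically captured in an `environment.yml` file that collaborators can recreate with `conda env create -f environment.yml`. A purely illustrative example (the environment name, packages, and version pins below are invented, not a recommendation):

```yaml
# Illustrative environment file; name and package versions are examples.
name: replication-env
channels:
  - conda-forge
dependencies:
  - python=3.11
  - numpy=1.26
  - pandas=2.1
  - r-base=4.3
```

Pinning versions this way helps later replicators obtain the same software stack used in the original analysis.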
Stage 1 Registered Report Submission Template [Tags: Economics and Finance; Political Science; Pre-Analysis Plans; Results-Blind Review & Registered Reports; Transparent Reporting]
BITSS prepared a template to assist authors in the preparation of their Stage 1 Proposal submissions to the Journal of Development Economics. The template expands on features that are commonly reported in pre-analysis plans in development economics, and includes a checklist to help authors record different parts of the research design.
Whole Tale [Tags: Data Management and De-identification; Data Visualization; Interdisciplinary; Replications; Statistics and Data Science; Version Control]

Whole Tale is an infrastructure that allows users to share data, methods and analysis protocols, and final research outputs in a single, executable object (“living publication” or “tale”) alongside any research publication. Learn more here.
NRIN Collection of Resources on Research Integrity [Tags: Data Management and De-identification; Interdisciplinary; Meta-Analyses; Open Publishing; Registries; Transparent Reporting]
Course materials: PhD Toolkit on Transparent, Open, and Reproducible Research [Tags: Economics and Finance; Meta-Analyses; Pre-Analysis Plans; Public Policy; Registries; Replications]
Catalyst Ada Gonzalez-Torres developed and delivered a PhD course on Transparent, Open, and Reproducible Research for PhD students at the European University Institute (EUI), in Florence, Italy. Find all course materials here.
RT2 Los Angeles [Tags: Interdisciplinary]
BITSS held its Research Transparency and Reproducibility Training (RT2) in Los Angeles, CA, September 5-7, 2018.
Find all resources from the training below:
Registry of Efficacy and Effectiveness Studies [Tags: Education; Registries]
The Registry of Efficacy and Effectiveness Studies (REES) is a registry for studies designed to establish causal conclusions in Education research. Eligible designs include randomized trials, quasi-experimental designs, regression discontinuity designs, and single-case designs.
RT2 Amsterdam [Tags: Interdisciplinary]
BITSS held its Research Transparency and Reproducibility Training (RT2) in Amsterdam, the Netherlands, April 4-6, 2018.
Find all resources from the training below:
RT2 London [Tags: Interdisciplinary]
BITSS held its Research Transparency and Reproducibility Training (RT2) in London, UK, September 20-22, 2017.
Find all resources from the training below:
RT2 Berkeley [Tags: Interdisciplinary]
BITSS held its Research Transparency and Reproducibility Training (RT2) in Berkeley, CA, June 7-9, 2017.
Find all resources from the training below:
2016 Summer Institute [Tags: Interdisciplinary]
BITSS held its third Summer Institute in Berkeley, CA, June 8-10, 2016.
Find all resources from the training below:
2015 Summer Institute [Tags: Interdisciplinary]
BITSS held its second Summer Institute in Berkeley, CA, June 10-12, 2015.
Find all resources from the training below:
2014 Summer Institute [Tags: Interdisciplinary]
BITSS held its first Summer Institute in Berkeley, CA, June 2-6, 2014.
Find all resources from the training below:
PhD Course Materials: Transparent, Open, and Reproducible Policy Research [Tags: Data Management and De-identification; Dynamic Documents and Coding Practices; Health Sciences; Interdisciplinary; Issues with transparency and reproducibility; Meta-Analyses; Open Publishing; Pre-Analysis Plans; Preprints; Public Policy; Registries; Replications; Statistical Literacy; Transparent Reporting; Version Control]
BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.
Transparency Training Module for Undergraduate Experimental Economics [Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility; Meta-Analyses; Pre-Analysis Plans; Replications; Statistical Literacy]
These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.
These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.
Registered Reports at the Journal of Development Economics [Tags: Economics and Finance; Results-Blind Review & Registered Reports]

As part of a pilot project, the Journal of Development Economics (JDE) now offers authors the opportunity to submit empirical research designs for review and approval before the results of the study are known. The pre-results review track is designed to reward well-designed and well-executed studies, regardless of whether their empirical results yield clear interpretations.
Learn more about the pilot in this blog post by JDE Editors Andrew Foster and Dean Karlan, and BITSS Faculty Director Edward Miguel.
COS Registered Reports information portal [Tags: Interdisciplinary; Results-Blind Review & Registered Reports]

The Center for Open Science (COS) has put together a portal containing information about the registered reports format of peer review and publication. The portal includes general information about registered reports, a list of journals that have implemented the format, an explanation of an appropriate workflow, resources for journal editors, motivation for funders, FAQs, and a list of allied initiatives, including those that focus on results-blind review and Exploratory Reports.
Mapping the Universe of Registered Reports [Tags: Interdisciplinary; Results-Blind Review & Registered Reports]
A preprint by Tom Hardwicke and John Ioannidis. Abstract: Selection pressures for significant results may infuse bias into the research process. We evaluated the implementation of one innovation designed to mitigate this bias, ‘Registered Reports’, where study protocols are peer-reviewed and granted in-principle acceptance (IPA) for publication before the study has been conducted. As of February 2018, 91 journals had adopted Registered Reports and 91 Final Reports had been published. Psychology journals are the principal adopters, but expansion has begun into medicine, social science, and other fields. Among 29 journals that responded to a survey, 334 protocols had been submitted to them, 87 had been granted IPA and 32 Final Reports had been published or were in press as of July 2017. We encountered several sub-optimal implementation practices, including non-availability of IPA protocols, and diverse approaches to protocol registration in the absence of a single central registry. Registered Reports should be iteratively evaluated and improved to ensure maximal benefits.
Course Syllabi for Open and Reproducible Methods [Tags: Anthropology, Archaeology, and Ethnography; Data Repositories; Data Visualization; Dynamic Documents and Coding Practices; Economics and Finance; Engineering and Computer Science; Health Sciences; Humanities; Interdisciplinary; Issues with transparency and reproducibility; Life Sciences; Linguistics; Meta-Analyses; Metascience (Methods and Archival Science); Open Publishing; Other Social Sciences; Political Science; Power analysis; Pre-Analysis Plans; Psychology; Public Policy; Registries; Replications; Sociology; Statistical Literacy; Statistics and Data Science; Transparent Reporting; Version Control]

A collection of course syllabi from any discipline featuring content to examine or improve open and reproducible research practices. Housed on the OSF.
AsPredicted.org [Tags: Interdisciplinary; Registries]
AsPredicted.org is “a standardized pre-registration that requires only what’s necessary to separate exploratory from confirmatory analyses.” You can easily generate a short and simple pre-registration document that “takes less effort to evaluate than it takes to evaluate the published study itself.” The form, designed by Uri Simonsohn, Joe Simmons, and Leif Nelson, has only nine questions, which are general enough that they are relevant to nearly all disciplines and types of research.
AEA Registry for RCTs [Tags: Economics and Finance; Political Science; Pre-Analysis Plans; Psychology; Public Policy; Registries]

The American Economic Association (AEA) Randomized Controlled Trials Registry is a registry for RCTs conducted in the social sciences. Registration is free, and you do not need to be an AEA member to register. Researchers are encouraged to register any new study before data collection.
rOpenSci Packages [Tags: Data Management and De-identification; Dynamic Documents and Coding Practices; Interdisciplinary; Meta-Analyses; Metascience (Methods and Archival Science); Power analysis; Replications; Statistics and Data Science; Version Control]

These packages are carefully vetted, staff- and community-contributed R software tools that lower barriers to working with scientific data sources and data that support research applications on the web.
Improving the Credibility of Social Science Research: A Practical Guide for Researchers [Tags: Data Management and De-identification; Economics and Finance; Interdisciplinary; Issues with transparency and reproducibility; Political Science; Pre-Analysis Plans; Psychology; Public Policy; Registries; Replications; Sociology]
Code Ocean (in beta) [Tags: Data Repositories]

Code Ocean is a cloud-based computational reproducibility platform that provides researchers and developers an easy way to share, discover and run code published in academic journals and conferences. Upload code and data in 10 programming languages and link working code in a computational environment with the associated article for free. Code Ocean assigns a Digital Object Identifier (DOI) to the algorithm, providing correct attribution and a connection to the published research.