Resource Library
The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by any of these parameters, or use the search bar to find what you’re looking for.
Know of a great resource that we haven’t included or have questions about the existing resources? Email us!
BITSS Registered Reports Literature Review Economics
Nextjournal Dynamic Documents and Coding Practices

Nextjournal is a container-based notebook tool with features like polyglot notebooks, automatic versioning, and real-time collaboration.
Frontiers in Pre-Registration in Economics – Ted Miguel Economics
This presentation by Ted Miguel was given at the Transparency, Reproducibility and Credibility Research Symposium at the World Bank on 9/10/2019. You can find videos of other talks from the Symposium in this playlist.
Transparent and Open Social Science Research (FR) Dynamic Documents and Coding Practices
Demand is growing for evidence-based policy making, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created and administered by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.
Software Carpentry Data Management and De-identification
Software Carpentry offers online tutorials for data analysis, including Version Control with Git, Using Databases and SQL, Programming with Python, Programming with R, and Programming with MATLAB.
BITSS training survey templates Interdisciplinary, Issues with transparency and reproducibility
BITSS developed templates for pre- and post-training surveys that can be used by instructors to record learning outcomes in research transparency and reproducibility training events.
The links below enable access as an editor; please make a copy of each form to use it for your own purposes:
Observational PAP Guide Economics and Finance
In her preprint titled “Improving transparency in observational social science research: A pre-analysis plan approach”, Fiona Burlig (University of Chicago) presents three scenarios in which study preregistration and pre-analysis plans (PAPs) can be credibly applied in non-experimental settings: cases where researchers collect their own data; prospective studies; and research using restricted-access data. The preprint also includes suggested contents for observational PAPs, and highlights where observational PAPs should deviate from those designed for experimental research.
This work was also published in the journal Economics Letters.
ResponsibleData.io Data Management and De-identification
Using data for social change work offers many opportunities, but it brings challenges, too. The Responsible Data (RD) community develops practical ways to deal with the unintended consequences of using data in social change work, establishes best practices, and shares approaches between leading thinkers and doers from different sectors. The community discusses thorny topics in person, facilitates online group discussions on the RD mailing list, and shares resources on its website.
Seven Reasons Why: A User’s Guide to Transparency and Reproducibility Political Science
BITSS Catalyst Dalson Figueiredo Filho and colleagues present seven practical insights and recommendations in favor of research transparency and reproducibility in what is one of the first discussions of open science in Brazilian political science.
Registry for International Development Impact Evaluations (RIDIE) Economics and Finance

Administered by the International Initiative for Impact Evaluation (3ie), the Registry for International Development Impact Evaluations (RIDIE) is a registry of impact evaluations related to development in low- and middle-income countries. RIDIE will register any development impact evaluation that rigorously attempts to estimate the causal impacts of a program, including but not limited to randomized controlled trials. It is intended to be a prospective registry in which researchers and evaluators can record information about their evaluation designs before conducting the analysis, as well as update information as the study proceeds and post findings upon study completion.
Catalog of open source licenses Interdisciplinary, Open Publishing

Using this online tool, you can choose an open source license to clearly articulate the conditions under which others can use, distribute, modify, or contribute to your software and non-software projects.
ARDC FAIR Data self-assessment tool Data Management and De-identification

This checklist, developed by the Australian Research Data Commons (ARDC), may help researchers make their datasets FAIRer: findable, accessible, interoperable, and reusable.
Conda Data Visualization

Conda is an open-source package and environment management system that runs on Windows, macOS, and Linux. Conda installs, runs, and updates packages and their dependencies, and works with multiple languages, including Python, R, Ruby, Lua, Scala, Java, JavaScript, C/C++, and Fortran.
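As an illustration only (this workflow is our own sketch, not taken from the Conda documentation or the BITSS library), the short Python snippet below shows how a research project might snapshot a Conda environment and recreate it elsewhere so collaborators run the same package versions. It shells out to the standard conda env export and conda env create commands and assumes the conda executable is on the PATH; the environment and file names are arbitrary examples.

```python
# A minimal sketch, assuming `conda` is available on PATH.
# Exports the package list (with versions) of a named environment to a YAML
# file, and shows how to recreate the environment from that file elsewhere.
import subprocess

def export_environment(env_name: str, outfile: str = "environment.yml") -> None:
    """Write the environment's packages and versions to a YAML spec file."""
    with open(outfile, "w") as fh:
        subprocess.run(["conda", "env", "export", "-n", env_name],
                       stdout=fh, check=True)

def recreate_environment(specfile: str = "environment.yml") -> None:
    """Recreate the environment on another machine from the exported spec."""
    subprocess.run(["conda", "env", "create", "-f", specfile], check=True)

if __name__ == "__main__":
    export_environment("my-analysis")  # hypothetical environment name
```

Checking the exported environment.yml into version control alongside analysis code is one common way to make the computational environment itself reproducible.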
Stage 1 Registered Report Submission Template Economics and Finance
BITSS prepared a template to assist authors in the preparation of their Stage 1 Proposal submissions to the Journal of Development Economics. The template expands on features that are commonly reported in pre-analysis plans in development economics, and includes a checklist to help authors record different parts of the research design.
Whole Tale Data Management and De-identification

Whole Tale is an infrastructure that allows users to share data, methods and analysis protocols, and final research outputs in a single, executable object (“living publication” or “tale”) alongside any research publication. Learn more here.
NRIN Collection of Resources on Research Integrity Data Management and De-identification
Course materials: PhD Toolkit on Transparent, Open, and Reproducible Research Economics and Finance
Catalyst Ada Gonzalez-Torres developed and delivered a PhD course on Transparent, Open, and Reproducible Research for PhD students at the European University Institute (EUI), in Florence, Italy. Find all course materials here.
RT2 Los Angeles Interdisciplinary
BITSS held its Research Transparency and Reproducibility Training (RT2) in Los Angeles, CA, September 5-7, 2018.
Find all resources from the training below:
RT2 Amsterdam Interdisciplinary
BITSS held its Research Transparency and Reproducibility Training (RT2) in Amsterdam, the Netherlands, April 4-6, 2018.
Find all resources from the training below:
RT2 London Interdisciplinary
BITSS held its Research Transparency and Reproducibility Training (RT2) in London, UK, September 20-22, 2017.
Find all resources from the training below:
RT2 Berkeley Interdisciplinary
BITSS held its Research Transparency and Reproducibility Training (RT2) in Berkeley, CA, June 7-9, 2017.
Find all resources from the training below:
2016 Summer Institute Interdisciplinary
BITSS held its third Summer Institute in Berkeley, CA, June 8-10, 2016.
Find all resources from the training below:
2015 Summer Institute Interdisciplinary
BITSS held its second Summer Institute in Berkeley, CA, June 10-12, 2015.
Find all resources from the training below:
2014 Summer Institute Interdisciplinary
BITSS held its first Summer Institute in Berkeley, CA, June 2-6, 2014.
Find all resources from the training below:
PhD Course Materials: Transparent, Open, and Reproducible Policy Research Data Management and De-identification
BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.
Transparency Training Module for Undergraduate Experimental Economics Dynamic Documents and Coding Practices
These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.
These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.
Registered Reports at the Journal of Development Economics Economics and Finance

As part of a pilot project, the Journal of Development Economics (JDE) now offers authors the opportunity to submit empirical research designs for review and approval before the results of the study are known. The pre-results review track is designed to reward well-designed and well-executed studies regardless of whether their empirical results yield clear interpretations.
Learn more about the pilot in this blog post by JDE Editors Andrew Foster and Dean Karlan, and BITSS Faculty Director Edward Miguel.
COS Registered Reports information portal Interdisciplinary

The Center for Open Science (COS) has put together a portal containing information about the registered reports format of peer review and publication. The portal includes general information about registered reports, a list of journals that have implemented the format, an explanation of an appropriate workflow, resources for journal editors, motivation for funders, FAQs, and a list of allied initiatives, including those that focus on results-blind review and Exploratory Reports.
Mapping the Universe of Registered Reports Interdisciplinary
A preprint by Tom Hardwicke and John Ioannidis. Abstract: Selection pressures for significant results may infuse bias into the research process. We evaluated the implementation of one innovation designed to mitigate this bias, ‘Registered Reports’, where study protocols are peer-reviewed and granted in-principle acceptance (IPA) for publication before the study has been conducted. As of February 2018, 91 journals had adopted Registered Reports and 91 Final Reports had been published. Psychology journals are the principal adopters, but expansion has begun into medicine, social science, and other fields. Among 29 journals that responded to a survey, 334 protocols had been submitted to them, 87 had been granted IPA and 32 Final Reports had been published or were in press as of July 2017. We encountered several sub-optimal implementation practices, including non-availability of IPA protocols, and diverse approaches to protocol registration in the absence of a single central registry. Registered Reports should be iteratively evaluated and improved to ensure maximal benefits.
Course Syllabi for Open and Reproducible Methods Anthropology, Archaeology, and Ethnography

A collection of course syllabi from any discipline featuring content to examine or improve open and reproducible research practices. Housed on the OSF.
AsPredicted.org Interdisciplinary
AsPredicted.org is “a standardized pre-registration that requires only what’s necessary to separate exploratory from confirmatory analyses.” You can easily generate a short and simple pre-registration document that “takes less effort to evaluate than it takes to evaluate the published study itself.” The form, designed by Uri Simonsohn, Joe Simmons, and Leif Nelson, has only nine questions, which are general enough that they are relevant to nearly all disciplines and types of research.
AEA RCT Registry Economics and Finance

The American Economic Association (AEA) Randomized Controlled Trials Registry is a registry for RCTs conducted in the social sciences. Because existing registries were not well suited to the needs of the social sciences, the AEA executive committee decided in April 2012 to establish a registry for economics and other social sciences.
rOpenSci Packages Data Management and De-identification

These packages are carefully vetted, staff- and community-contributed R software tools that lower barriers to working with scientific data sources and data that support research applications on the web.
Improving the Credibility of Social Science Research: A Practical Guide for Researchers Data Management and De-identification
TOP Guidelines Interdisciplinary, Transparent Reporting
The Transparency and Openness Promotion (TOP) Guidelines are a set of eight modular transparency standards for academic journals, each with three levels of increasing stringency. Journals select which of the eight standards to adopt and choose a level of implementation for each. This modular structure provides flexibility for adoption across disciplines while still establishing community standards.
EQUATOR Network Health Sciences
The EQUATOR Network is an organization that tracks reporting guidelines across numerous types of studies. It currently lists over 275 guidelines covering a wide range of research types.
CONSORT Statement Health Sciences
The 2010 CONSORT Statement is a widely adopted set of recommendations for randomized trial reporting. It includes a concise reporting checklist for researchers to follow, and has been published in the British Medical Journal, the Lancet, and PLoS Medicine.
PRISMA Interdisciplinary

PRISMA is an evidence-based minimum set of items for reporting in systematic reviews and meta-analyses. PRISMA focuses on the reporting of reviews evaluating randomized trials, but can also be used as a basis for reporting systematic reviews of other types of research, particularly evaluations of interventions.
STROBE Statement Health Sciences
The STROBE Statement is a reporting guideline written for observational studies in epidemiology. It incorporates a checklist of 22 items considered essential for observational study reporting.
SPARC (Scholarly Publishing and Academic Resources Coalition) Data Management and De-identification

This community resource for tracking, comparing, and understanding both current and future U.S. federal funder research data sharing policies is a joint project of SPARC & Johns Hopkins University Libraries.
Royal Society Open Science Registered Reports Health Sciences

Royal Society Open Science is a fast, open journal publishing high-quality research across all of science, engineering, and mathematics. A Registered Report (RR) is a form of journal article in which methods and proposed analyses are pre-registered and peer-reviewed before the research is conducted (Stage 1). High-quality protocols are then provisionally accepted for publication before data collection commences. The format is open to replication attempts as well as novel studies. Once the study is completed, the authors finish the article by adding the results and discussion sections (Stage 2), which are then appraised by the reviewers and, provided the necessary conditions are met, published.
Accountable Replications Policy “Pottery Barn” Dynamic Documents and Coding Practices

The Accountable Replication Policy commits the Psychology and Cognitive Neuroscience section of Royal Society Open Science to publishing replications of studies previously published within the journal. Authors can either submit a replication study that is already completed or a proposal to replicate a previous study. To ensure that the review process is unbiased by the results, submissions will be reviewed with existing results initially redacted (where applicable), or, in the case of study proposals, before the results exist. Submissions that report close, clear, and valid replications of the original methodology will be offered in-principle acceptance, which virtually guarantees publication of the replication regardless of the study outcome.
Go Fishing App Interdisciplinary

If you get to choose your tests after you see the data, you can get whatever results you like. To see the logic, try out this fishy test.
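To make the intuition concrete, here is a small Python simulation (our own illustration, not part of the Go Fishing App) showing that when there is no true effect, testing many outcomes and reporting whichever comparison happens to come out "significant" inflates the false-positive rate far above the nominal 5%.

```python
# Simulate "fishing": under a true null, test many outcomes and count how
# often at least one test is significant at the 5% level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_simulations, n_outcomes, n_per_group = 2000, 10, 50

false_positives = 0
for _ in range(n_simulations):
    # Treatment and control drawn from the same distribution: no real effect.
    treatment = rng.normal(size=(n_outcomes, n_per_group))
    control = rng.normal(size=(n_outcomes, n_per_group))
    pvals = [stats.ttest_ind(treatment[i], control[i]).pvalue
             for i in range(n_outcomes)]
    # "Fishing": report the study if any of the ten tests looks significant.
    if min(pvals) < 0.05:
        false_positives += 1

share = false_positives / n_simulations
# With 10 independent tests, 1 - 0.95**10 is about 0.40, not 0.05.
print(f"Share of null studies with at least one 'significant' result: {share:.2f}")
```

Pre-registering which tests will be run, as the resources above encourage, is the standard way to close off this garden of forking paths.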
statcheck Web App Interdisciplinary

statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck. Using statcheck/web, you can check any PDF for statistical errors without installing R on your computer.
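For intuition, the snippet below is a rough Python sketch of the kind of internal-consistency check that statcheck automates: recomputing a p-value from a reported test statistic and degrees of freedom and comparing it with the reported p-value. statcheck itself is an R package that parses APA-formatted text, so this is only an illustration of the underlying idea, with a hypothetical helper name and tolerance.

```python
# Recompute a two-sided p-value from a reported t statistic and compare it
# with the p-value given in the paper (allowing a small rounding tolerance).
from scipy import stats

def check_t_report(t_value: float, df: int, reported_p: float,
                   tolerance: float = 0.005) -> bool:
    """Return True if the reported two-sided p-value matches the t statistic."""
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)
    return abs(recomputed_p - reported_p) <= tolerance

# Example: "t(28) = 2.20, p = .04" -- the recomputed p is about .036,
# so the report is internally consistent within rounding.
print(check_t_report(t_value=2.20, df=28, reported_p=0.04))
```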