Resource Library
The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by applying criteria along these parameters or use the search bar to find what you’re looking for.
Know of a great resource that we haven’t included or have questions about the existing resources? Email us!
Data for Development Impact (Resource Guide) Data Management, Economics, Other Social Sciences, Statistics and Data Science
Open Science Module for Behavioral Science graduate course Economics+Psychology
Instructors Kelly Zhang (MIT GOV/LAB) and Chaning Jang (Busara) integrated a module on research transparency and the use of pre-analysis plans as part of the Behavioral Science in the Field course designed for graduate students who use behavioral science games as part of their research.
J-PAL Guide to De-Identifying Data Data Management+International Development
Developed by J-PAL’s Sarah Kooper, Anja Sautmann, and James Turrito, this guide includes:
- An overview of personally identifiable information (PII) and the responsibility of data users not to use data to try to identify human subjects
- Recommendations for handling direct identifiers (such as full name, social security number, or phone number), as well as indirect identifiers (such as month/year of birth, nationality, or gender)
- Guidance on de-identification steps to take throughout the research process, such as encrypting all data containing identifying information as soon as possible
- A list of common identifiers, including those labeled by the United States’ Health Insurance Portability and Accountability Act (HIPAA) guidelines as direct identifiers
- And more.
See also the accompanying Guide to Publishing Research Data.
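The de-identification steps above can be sketched in a few lines of code. This is a minimal illustration, not J-PAL's procedure: the record fields, salt, and coarsening choices are all hypothetical.

```python
import hashlib

# Hypothetical survey records; field names are illustrative only.
records = [
    {"full_name": "Jane Doe", "phone": "555-0100",
     "birth_date": "1987-04-12", "gender": "F", "income": 42000},
]

DIRECT_IDENTIFIERS = {"full_name", "phone"}  # drop these outright
SALT = "project-specific-secret"  # never published with the data

def deidentify(record):
    # Keep a salted, non-reversible subject ID so records stay linkable.
    subject_id = hashlib.sha256(
        (SALT + record["full_name"]).encode()
    ).hexdigest()[:10]
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Coarsen an indirect identifier: keep only the year of birth.
    out["birth_year"] = out.pop("birth_date")[:4]
    out["subject_id"] = subject_id
    return out

public = [deidentify(r) for r in records]
print(public[0]["birth_year"])  # "1987"
```

In practice, guides like this one also stress encrypting any file that still contains identifying information, including the linkage between identifiers and subject IDs.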
J-PAL Guide to Publishing Research Data Data Management, International Development, Public Policy
Developed by J-PAL’s Sarah Kooper, Anja Sautmann, and James Turrito, this guide includes:
- A list of considerations to make before publishing data, such as what information was provided to study participants and the IRB, the sensitivity of the data collected, and legal requirements
- Sample consent form language that will allow future publication of de-identified data
- A checklist for preparing data for publication
- And more.
See also the accompanying Guide to De-identifying Data.
Data Sharing Checklist for NGOs and Practitioners Data Management+Interdisciplinary
This checklist, developed by Teamscope, can help NGOs and practitioners understand common pitfalls in open data and how open data affects every step of a project’s pipeline, from proposal writing to dissemination.
Videos: Research Transparency and Reproducibility Training (RT2) – Washington, D.C. Data Management, Interdisciplinary, Issues with transparency and reproducibility, Meta-Analyses, Power analysis, Pre-Analysis Plans, Preprints, Registries, Replications, Results-Blind Review & Registered Reports, Statistical Literacy, Transparent Reporting, Version Control
BITSS hosted a Research Transparency and Reproducibility Training (RT2) in Washington, D.C., on September 11-13, 2019. This was the eighth training event of this kind organized by BITSS since 2014.
RT2 provides participants with an overview of tools and best practices for transparent and reproducible social science research. Click here to view videos of presentations given during the training. Find slide decks and other useful materials on this OSF project page (https://osf.io/3mxrw/).
Preregistration of secondary data analysis: A template and tutorial Interdisciplinary+Registries
Van den Akker and colleagues present a template specifically designed for the preregistration of secondary data analyses and provide comments and a practical example.
Replicability Seminar Issues with transparency and reproducibility+Statistical Literacy
Course syllabus for “Replicability Seminar”, an advanced undergraduate and graduate-level course led by Simine Vazire.
Open Data Metrics: Lighting the Fire Data Management+Interdisciplinary
In this book, Daniella Lowenberg and colleagues describe the journey toward open data metrics, prompting community discussion and providing implementation examples along the way. Data metrics are a precondition for realizing the benefits of open data sharing practices.
BITSS Registered Reports Literature Review Economics, Pre-Analysis Plans, Results-Blind Review & Registered Reports
Prepared by BITSS, this literature review includes information on the distinguishing features and advantages of registered reports, as well as challenges involved in their implementation.
Nextjournal Dynamic Documents and Coding Practices+Version Control
Nextjournal is a container tool with features like polyglot notebooks, automatic versioning and real-time collaboration.
Frontiers in Pre-Registration in Economics – Ted Miguel Economics, Pre-Analysis Plans, Registries, Results-Blind Review & Registered Reports
This presentation by Ted Miguel was given at the Transparency, Reproducibility and Credibility Research Symposium at the World Bank on 9/10/2019. You can find videos of other talks from the Symposium in this playlist.
Transparent and Open Social Science Research (FR) Dynamic Documents and Coding Practices+Issues with transparency and reproducibility
Demand is growing for evidence-based policy making, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created and administered by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.
Software Carpentry Data Management, Dynamic Documents and Coding Practices, Engineering and Computer Science, Interdisciplinary, Statistics and Data Science, Version Control
Software Carpentry offers online tutorials for data analysis including Version Control with Git, Using Databases and SQL, Programming with Python, Programming with R and Programming with MATLAB.
BITSS training survey templates Interdisciplinary, Issues with transparency and reproducibility
BITSS developed templates for pre- and post-training surveys that can be used by instructors to record learning outcomes in research transparency and reproducibility training events.
The links below enable access as an editor; please make a copy of each form to use it for your own purposes:
Data Carpentry Lessons Data Management+Interdisciplinary
Developed by Data Carpentry, these lessons can be used across the social sciences to teach data cleaning, management, analysis, and visualization. R is the base language for instruction, and no prior knowledge of the topic is required.
Observational PAP Guide Economics and Finance+Pre-Analysis Plans
In her preprint titled “Improving transparency in observational social science research: A pre-analysis plan approach”, Fiona Burlig (University of Chicago) presents three scenarios in which study preregistration and pre-analysis plans (PAPs) can be credibly applied in non-experimental settings: cases where researchers collect their own data; prospective studies; and research using restricted-access data. The preprint also includes suggested contents for observational PAPs, and highlights where observational PAPs should deviate from those designed for experimental research.
This work was also published in the journal Economics Letters.
Web Plot Digitizer Data Management, Interdisciplinary, Metascience (Methods and Archival Science), Statistics and Data Science
This app extracts the underlying data points from images of charts and plots.
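Under the hood, digitizers of this kind map pixel coordinates to data coordinates using calibration points the user places on known axis ticks. Here is a minimal sketch of that idea for a linear axis; the function name and numbers are illustrative, not Web Plot Digitizer's actual API.

```python
def calibrate(p1, p2):
    """Build a pixel -> data mapping from two calibration points.

    Each point is (pixel_coordinate, data_value); assumes a linear axis.
    """
    (px1, v1), (px2, v2) = p1, p2
    scale = (v2 - v1) / (px2 - px1)
    return lambda px: v1 + (px - px1) * scale

# Suppose the x-axis tick at pixel 100 reads 0 and the tick at pixel 500 reads 10.
to_x = calibrate((100, 0.0), (500, 10.0))
print(to_x(300))  # 5.0
```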
ResponsibleData.io Data Management, Dynamic Documents and Coding Practices, Interdisciplinary, Metascience (Methods and Archival Science), Statistics and Data Science
Using data for social change work offers many opportunities, but it brings challenges, too. The RD community develops practical ways to deal with the unintended consequences of using data in social change work, establishes best practices, and shares approaches between leading thinkers and doers from different sectors. We discuss thorny topics in-person, facilitate online group discussions on the RD mailing list, and share resources on this site.
Seven Reasons Why: A User’s Guide to Transparency and Reproducibility Political Science
BITSS Catalyst Dalson Figueiredo Filho and colleagues present seven practical insights and recommendations in favor of research transparency and reproducibility in what is one of the first discussions of open science in Brazilian political science.
Registry for International Development Impact Evaluations (RIDIE) Economics and Finance, Interdisciplinary, Political Science, Registries
Administered by the International Initiative for Impact Evaluation (3ie), the Registry for International Development Impact Evaluations (RIDIE) is a registry of impact evaluations related to development in low and middle income countries. RIDIE will register any development impact evaluation that rigorously attempts to estimate the causal impacts of a program, including but not limited to randomized control trials. It is intended to be a prospective registry in which researchers and evaluators can record information about their evaluation designs before conducting the analysis, as well as update information as the study proceeds and post findings upon study completion.
Catalog of open source licenses Interdisciplinary, Open Publishing
Using this online tool, you can choose an open source license to clearly articulate the conditions under which others can use, distribute, modify or contribute to your software and non-software projects.
ARDC FAIR Data self-assessment tool Data Management+Interdisciplinary
This checklist, developed by the Australian Research Data Commons (ARDC), may help researchers make their datasets FAIRer: findable, accessible, interoperable, and re-usable.
Conda Data Visualization, Interdisciplinary, Statistics and Data Science
Conda is an open-source package and environment management system that runs on Windows, macOS, and Linux. Conda installs, runs, and updates packages and their dependencies, and works with multiple languages, including Python, R, Ruby, Lua, Scala, Java, JavaScript, C/C++, and Fortran.
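For reproducible research, a Conda environment is typically pinned in an `environment.yml` file and recreated by collaborators with `conda env create -f environment.yml`. A minimal sketch; the environment name and package versions below are illustrative, not prescribed:

```yaml
# Illustrative environment file; names and versions are examples only.
name: analysis
channels:
  - conda-forge
dependencies:
  - python=3.11
  - r-base=4.3
  - pandas=2.1
```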
Stage 1 Registered Report Submission Template Economics and Finance, Political Science, Pre-Analysis Plans, Results-Blind Review & Registered Reports, Transparent Reporting
BITSS prepared a template to assist authors in the preparation of their Stage 1 Proposal submissions to the Journal of Development Economics. The template expands on features that are commonly reported in pre-analysis plans in development economics, and includes a checklist to help authors record different parts of the research design.
Whole Tale Data Management, Data Visualization, Interdisciplinary, Replications, Statistics and Data Science, Version Control
Whole Tale is an infrastructure that allows users to share data, methods and analysis protocols, and final research outputs in a single, executable object (“living publication” or “tale”) alongside any research publication. Learn more here.
NRIN Collection of Resources on Research Integrity Data Management, Interdisciplinary, Meta-Analyses, Open Publishing, Registries, Transparent Reporting
Course materials: PhD Toolkit on Transparent, Open, and Reproducible Research Economics and Finance, Meta-Analyses, Pre-Analysis Plans, Public Policy, Registries, Replications
Catalyst Ada Gonzalez-Torres developed and delivered a PhD course on Transparent, Open, and Reproducible Research for PhD students at the European University Institute (EUI), in Florence, Italy. Find all course materials here.
RT2 Los Angeles Interdisciplinary
BITSS held its Research Transparency and Reproducibility Training (RT2) in Los Angeles, CA, September 5-7, 2018.
Find all resources from the training below:
Registry of Efficacy and Effectiveness Studies Education+Registries
The Registry of Efficacy and Effectiveness Studies (REES) is a registry for studies designed to establish causal conclusions in Education research. Eligible designs include randomized trials, quasi-experimental designs, regression discontinuity designs, and single-case designs.
RT2 Amsterdam Interdisciplinary
BITSS held its Research Transparency and Reproducibility Training (RT2) in Amsterdam, the Netherlands, on April 4-6, 2018.
Find all resources from the training below:
RT2 London Interdisciplinary
BITSS held its Research Transparency and Reproducibility Training (RT2) in London, UK, on September 20-22, 2017.
Find all resources from the training below:
RT2 Berkeley Interdisciplinary
BITSS held its Research Transparency and Reproducibility Training (RT2) in Berkeley, CA, on June 7-9, 2017.
Find all resources from the training below:
2016 Summer Institute Interdisciplinary
BITSS held its third Summer Institute in Berkeley, CA, on June 8-10, 2016.
Find all resources from the training below:
2015 Summer Institute Interdisciplinary
BITSS held its second Summer Institute in Berkeley, CA, on June 10-12, 2015.
Find all resources from the training below:
2014 Summer Institute Interdisciplinary
BITSS held its first Summer Institute in Berkeley, CA, on June 2-6, 2014.
Find all resources from the training below:
PhD Course Materials: Transparent, Open, and Reproducible Policy Research Data Management, Dynamic Documents and Coding Practices, Health Sciences, Interdisciplinary, Issues with transparency and reproducibility, Meta-Analyses, Open Publishing, Pre-Analysis Plans, Preprints, Public Policy, Registries, Replications, Statistical Literacy, Transparent Reporting, Version Control
BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.
Transparency Training Module for Undergraduate Experimental Economics Dynamic Documents and Coding Practices, Issues with transparency and reproducibility, Meta-Analyses, Pre-Analysis Plans, Replications, Statistical Literacy
These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.
These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.
Registered Reports at the Journal of Development Economics Economics and Finance+Results-Blind Review & Registered Reports
As part of a pilot project, the Journal of Development Economics (JDE) now offers authors the opportunity to submit empirical research designs for review and approval before the results of the study are known. The pre-results review track is designed to reward well-designed and well-executed studies regardless of whether their empirical results yield clear interpretations.
Learn more about the pilot in this blog post by JDE Editors Andrew Foster and Dean Karlan, and BITSS Faculty Director Edward Miguel.
COS Registered Reports information portal Interdisciplinary+Results-Blind Review & Registered Reports
The Center for Open Science (COS) has put together a portal containing information about the registered reports format of peer review and publication. The portal includes general information about registered reports, a list of journals that have implemented the format, an explanation of an appropriate workflow, resources for journal editors, motivation for funders, FAQs, and a list of allied initiatives, including those that focus on results-blind review and Exploratory Reports.
Mapping the Universe of Registered Reports Interdisciplinary+Results-Blind Review & Registered Reports
A preprint by Tom Hardwicke and John Ioannidis. Abstract: Selection pressures for significant results may infuse bias into the research process. We evaluated the implementation of one innovation designed to mitigate this bias, ‘Registered Reports’, where study protocols are peer-reviewed and granted in-principle acceptance (IPA) for publication before the study has been conducted. As of February 2018, 91 journals had adopted Registered Reports and 91 Final Reports had been published. Psychology journals are the principal adopters, but expansion has begun into medicine, social science, and other fields. Among 29 journals that responded to a survey, 334 protocols had been submitted to them, 87 had been granted IPA and 32 Final Reports had been published or were in press as of July 2017. We encountered several sub-optimal implementation practices, including non-availability of IPA protocols, and diverse approaches to protocol registration in the absence of a single central registry. Registered Reports should be iteratively evaluated and improved to ensure maximal benefits.
Course Syllabi for Open and Reproducible Methods Anthropology, Archaeology, and Ethnography, Data Repositories, Data Visualization, Dynamic Documents and Coding Practices, Economics and Finance, Engineering and Computer Science, Health Sciences, Humanities, Interdisciplinary, Issues with transparency and reproducibility, Life Sciences, Linguistics, Meta-Analyses, Metascience (Methods and Archival Science), Open Publishing, Other Social Sciences, Political Science, Power analysis, Pre-Analysis Plans, Psychology, Public Policy, Registries, Replications, Sociology, Statistical Literacy, Statistics and Data Science, Transparent Reporting, Version Control
A collection of course syllabi from any discipline featuring content to examine or improve open and reproducible research practices. Housed on the OSF.
AsPredicted.org Interdisciplinary+Registries
AsPredicted.org is “a standardized pre-registration that requires only what’s necessary to separate exploratory from confirmatory analyses.” You can easily generate a short and simple pre-registration document that “takes less effort to evaluate than it takes to evaluate the published study itself.” The form, designed by Uri Simonsohn, Joe Simmons, and Leif Nelson, has only nine questions, which are general enough that they are relevant to nearly all disciplines and types of research.
AEA Registry for RCTs Economics and Finance, Political Science, Pre-Analysis Plans, Psychology, Public Policy, Registries
The American Economic Association (AEA) Randomized Controlled Trials Registry is a registry for RCTs conducted in the social sciences. Registration is free and you do not need to be a member of the AEA to register. We encourage you to register any new study before data collection.
rOpenSci Packages Data Management, Dynamic Documents and Coding Practices, Interdisciplinary, Meta-Analyses, Metascience (Methods and Archival Science), Power analysis, Replications, Statistics and Data Science, Version Control
These packages are carefully vetted, staff- and community-contributed R software tools that lower barriers to working with scientific data sources and data that support research applications on the web.
Improving the Credibility of Social Science Research: A Practical Guide for Researchers Data Management, Economics and Finance, Interdisciplinary, Issues with transparency and reproducibility, Political Science, Pre-Analysis Plans, Psychology, Public Policy, Registries, Replications, Sociology
Code Ocean (in beta) Data Repositories
Code Ocean is a cloud-based computational reproducibility platform that provides researchers and developers an easy way to share, discover and run code published in academic journals and conferences. Upload code and data in 10 programming languages and link working code in a computational environment with the associated article for free. Code Ocean assigns a Digital Object Identifier (DOI) to the algorithm, providing correct attribution and a connection to the published research.
TOP Guidelines Interdisciplinary, Transparent Reporting
Transparency and Openness Promotion (TOP) Guidelines are a set of eight modular transparency standards for academic journals, each with three levels of increasing stringency. Journals select which of the eight transparency standards they wish to adopt for their journal, and select a level of implementation for the selected standards. These features provide flexibility for adoption depending on disciplinary variation, but simultaneously establish community standards.
EQUATOR Network Health Sciences, Life Sciences, Psychology, Transparent Reporting
The EQUATOR Network is an organization that tracks reporting guidelines across numerous types of studies. It currently indexes over 275 guidelines covering many different types of research.
CONSORT Statement Health Sciences, Psychology, Transparent Reporting
The 2010 CONSORT Statement is a widely adopted set of recommendations for randomized trial reporting. It includes a concise reporting checklist for researchers to follow, and has been published in the British Medical Journal, the Lancet, and PLoS Medicine.
PRISMA Interdisciplinary, Metascience (Methods and Archival Science), Transparent Reporting
PRISMA is an evidence-based minimum set of items for reporting in systematic reviews and meta-analyses. PRISMA focuses on the reporting of reviews evaluating randomized trials, but can also be used as a basis for reporting systematic reviews of other types of research, particularly evaluations of interventions.
STROBE Statement Health Sciences+Transparent Reporting
The STROBE Statement is a reporting guideline written for observational studies in epidemiology. It incorporates a checklist of 22 items considered essential for observational study reporting.
SPARC (Scholarly Publishing and Academic Resources Coalition) Data Management+Transparent Reporting
This community resource for tracking, comparing, and understanding both current and future U.S. federal funder research data sharing policies is a joint project of SPARC & Johns Hopkins University Libraries.
Royal Society Open Science Registered Reports Health Sciences, Other Social Sciences, Pre-Analysis Plans, Psychology, Replications, Results-Blind Review & Registered Reports
Royal Society Open Science is a fast, open journal publishing high-quality research across all of science, engineering, and mathematics. A Registered Report (RR) is a form of journal article in which methods and proposed analyses are pre-registered and peer-reviewed before the research is conducted (Stage 1). High-quality protocols are then provisionally accepted for publication before data collection begins. The format is open to replication attempts as well as novel studies. Once the study is completed, the authors finish the article by adding the results and discussion sections (Stage 2). Reviewers then appraise the full article and, provided the necessary conditions are met, it is published.
Accountable Replications Policy “Pottery Barn” Dynamic Documents and Coding Practices, Open Publishing, Psychology, Replications
The Accountable Replication Policy commits the Psychology and Cognitive Neuroscience section of Royal Society Open Science to publishing replications of studies previously published within the journal. Authors can either submit a replication study that is already completed or a proposal to replicate a previous study. To ensure that the review process is unbiased by the results, submissions will be reviewed with existing results initially redacted (where applicable), or in the case of study proposals, before the results exist. Submissions that report close, clear, and valid replications of the original methodology will be offered in-principle acceptance, which virtually guarantees publication of the replication regardless of the study outcome.
Go Fishing App Interdisciplinary+Political Science
If you get to choose your tests after you see the data, you can get whatever results you like. To see the logic, try out this fishy test.
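The arithmetic behind this point is easy to simulate. Under a true null hypothesis a p-value is uniform on [0, 1], so a researcher who runs many tests and keeps only the best one finds "significance" far more often than the nominal 5% rate. The numbers of tests and trials below are arbitrary choices for illustration.

```python
import random

random.seed(1)
ALPHA, N_TESTS, N_TRIALS = 0.05, 20, 5000

# Each trial: run N_TESTS null tests, report only the smallest p-value.
fished = sum(
    min(random.random() for _ in range(N_TESTS)) < ALPHA
    for _ in range(N_TRIALS)
)

print(f"Honest false-positive rate per test: {ALPHA}")
print(f"Rate after fishing through {N_TESTS} tests: {fished / N_TRIALS:.2f}")
# Theory: 1 - (1 - 0.05) ** 20, roughly 0.64
```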