Resource Library
The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by applying criteria along these parameters or use the search bar to find what you’re looking for.
Know of a great resource that we haven’t included or have questions about the existing resources? Email us!
Social Science Reproduction Platform
Tags: Economics; Issues with transparency and reproducibility; Metascience (Methods and Archival Science); Other Social Sciences; Political Science; Psychology; Public Health; Public Policy; Replications; Reproducibility; Sociology; Statistics and Data Science
Videos: Research Transparency and Reproducibility Training (RT2) – Washington, D.C.
Tags: Data Management; Interdisciplinary; Issues with transparency and reproducibility; Meta-Analyses; Power analysis; Pre-Analysis Plans; Preprints; Registries; Replications; Results-Blind Review & Registered Reports; Statistical Literacy; Transparent Reporting; Version Control
BITSS hosted a Research Transparency and Reproducibility Training (RT2) in Washington DC, September 11-13, 2019. This was the eighth training event of this kind organized by BITSS since 2014.
RT2 provides participants with an overview of tools and best practices for transparent and reproducible social science research. Click here to view videos of presentations given during the training. Find slide decks and other useful materials on this OSF project page (https://osf.io/3mxrw/).
Whole Tale
Tags: Data Management; Data Visualization; Interdisciplinary; Replications; Statistics and Data Science; Version Control
Whole Tale is an infrastructure that allows users to share data, methods and analysis protocols, and final research outputs in a single, executable object (“living publication” or “tale”) alongside any research publication. Learn more here.
Course materials: PhD Toolkit on Transparent, Open, and Reproducible Research
Tags: Economics and Finance; Meta-Analyses; Pre-Analysis Plans; Public Policy; Registries; Replications
Catalyst Ada Gonzalez-Torres developed and delivered a PhD course on Transparent, Open, and Reproducible Research for PhD students at the European University Institute (EUI), in Florence, Italy. Find all course materials here.
PhD Course Materials: Transparent, Open, and Reproducible Policy Research
Tags: Data Management; Dynamic Documents and Coding Practices; Health Sciences; Interdisciplinary; Issues with transparency and reproducibility; Meta-Analyses; Open Publishing; Pre-Analysis Plans; Preprints; Public Policy; Registries; Replications; Statistical Literacy; Transparent Reporting; Version Control
BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.
Transparency Training Module for Undergraduate Experimental Economics
Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility; Meta-Analyses; Pre-Analysis Plans; Replications; Statistical Literacy
These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.
These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.
Course Syllabi for Open and Reproducible Methods
Tags: Anthropology, Archaeology, and Ethnography; Data Repositories; Data Visualization; Dynamic Documents and Coding Practices; Economics and Finance; Engineering and Computer Science; Health Sciences; Humanities; Interdisciplinary; Issues with transparency and reproducibility; Life Sciences; Linguistics; Meta-Analyses; Metascience (Methods and Archival Science); Open Publishing; Other Social Sciences; Political Science; Power analysis; Pre-Analysis Plans; Psychology; Public Policy; Registries; Replications; Sociology; Statistical Literacy; Statistics and Data Science; Transparent Reporting; Version Control
A collection of course syllabi from any discipline featuring content that examines or promotes open and reproducible research practices. Housed on the OSF.
rOpenSci Packages
Tags: Data Management; Dynamic Documents and Coding Practices; Interdisciplinary; Meta-Analyses; Metascience (Methods and Archival Science); Power analysis; Replications; Statistics and Data Science; Version Control
These packages are carefully vetted, staff- and community-contributed R software tools that lower barriers to working with scientific data sources and data that support research applications on the web.
Improving the Credibility of Social Science Research: A Practical Guide for Researchers
Tags: Data Management; Economics and Finance; Interdisciplinary; Issues with transparency and reproducibility; Political Science; Pre-Analysis Plans; Psychology; Public Policy; Registries; Replications; Sociology
Royal Society Open Science Registered Reports
Tags: Health Sciences; Other Social Sciences; Pre-Analysis Plans; Psychology; Replications; Results-Blind Review & Registered Reports
Royal Society Open Science is a fast, open journal publishing high-quality research across all of science, engineering, and mathematics. A Registered Report (RR) is a form of journal article in which methods and proposed analyses are pre-registered and peer-reviewed before the research is conducted (stage 1). High-quality protocols are then provisionally accepted for publication before data collection commences. The format is open to replication attempts as well as novel studies. Once the study is completed, the author finishes the article, including the results and discussion sections (stage 2). This is appraised by the reviewers and, provided the necessary conditions are met, published.
Accountable Replications Policy “Pottery Barn”
Tags: Dynamic Documents and Coding Practices; Open Publishing; Psychology; Replications
The Accountable Replication Policy commits the Psychology and Cognitive Neuroscience section of Royal Society Open Science to publishing replications of studies previously published within the journal. Authors can either submit a replication study that is already completed or a proposal to replicate a previous study. To ensure that the review process is unbiased by the results, submissions are reviewed with existing results initially redacted (where applicable), or, in the case of study proposals, before the results exist. Submissions that report close, clear, and valid replications of the original methodology are offered in-principle acceptance, which virtually guarantees publication of the replication regardless of the study outcome.
statcheck Web App
Tags: Interdisciplinary; Metascience (Methods and Archival Science); Psychology; Replications; Transparent Reporting
statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck. Using statcheck/web, you can check any PDF for statistical errors without installing the R programming language on your computer.
Retraction Watch
Tags: Interdisciplinary; Replications
Retraction Watch is a blog that reports on retractions of scientific papers, as a window into the scientific process.
Nicebread
Tags: Data Management; Data Visualization; Dynamic Documents and Coding Practices; Interdisciplinary; Issues with transparency and reproducibility; Meta-Analyses; Open Publishing; Power analysis; Pre-Analysis Plans; Preprints; Psychology; Registries; Replications; Results-Blind Review & Registered Reports; Transparent Reporting; Version Control
Dr. Felix Schönbrodt’s blog promoting research transparency and open science.
Jupyter Notebooks
Tags: Data Visualization; Interdisciplinary; Replications; Statistics and Data Science; Version Control
The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and explanatory text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, machine learning and much more.
Docker
Tags: Data Visualization; Interdisciplinary; Replications; Version Control
Docker is the world’s leading software container platform. Developers use Docker to eliminate “works on my machine” problems when collaborating on code with co-workers. Operators use Docker to run and manage apps side-by-side in isolated containers to get better compute density. Enterprises use Docker to build agile software delivery pipelines to ship new features faster, more securely and with confidence for both Linux and Windows Server apps.
NeuroChambers
Tags: Issues with transparency and reproducibility; Open Publishing; Power analysis; Pre-Analysis Plans; Psychology; Replications; Results-Blind Review & Registered Reports; Transparent Reporting
Chris Chambers is a psychologist and neuroscientist at the School of Psychology, Cardiff University. He created this blog after taking part in a debate about science journalism at the Royal Institution in March 2012. The aim of his blog is to give you some insights from the trenches of science. He talks about a range of science-related issues and may even give up a trade secret or two.
The New Statistics (+OSF Learning Page)
Tags: Data Management; Dynamic Documents and Coding Practices; Interdisciplinary; Meta-Analyses; Open Publishing; Power analysis; Pre-Analysis Plans; Psychology; Replications; Statistical Literacy; Statistics and Data Science; Transparent Reporting; Version Control
This OSF project helps organize resources for teaching the “New Statistics” — an approach that emphasizes asking quantitative questions, focusing on effect sizes, using confidence intervals to express uncertainty about effect sizes, using modern data visualizations, seeking replication, and using meta-analysis as a matter of course.
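The emphasis on effect sizes with confidence intervals can be made concrete with a short sketch. The following Python example is not part of the course materials; the helper name and the large-sample normal approximation for the interval are illustrative assumptions. It computes Cohen's d for two independent groups and an approximate 95% confidence interval:

```python
import math

def cohens_d_ci(m1, s1, n1, m2, s2, n2, z=1.96):
    """Cohen's d for two independent groups, with an approximate 95% CI
    based on the large-sample standard error of d."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Approximate standard error of d (large-sample formula)
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# Hypothetical data: group means 10 and 8, both SDs 3, 30 per group
d, (lo, hi) = cohens_d_ci(10.0, 3.0, 30, 8.0, 3.0, 30)
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")  # d = 0.67, 95% CI [0.15, 1.19]
```

Reporting the interval alongside the point estimate conveys the precision of the effect estimate, rather than reducing the result to a binary significance verdict.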
statcheck
Tags: Interdisciplinary; Metascience (Methods and Archival Science); Psychology; Replications; Transparent Reporting
statcheck is an R package that checks for errors in statistical reporting in APA-formatted documents. It can help estimate the prevalence of reporting errors and serves as a tool to check your own work before submitting. The package can be used to automatically extract statistics from articles and recompute p values. It is also available as a web app.
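statcheck itself is an R package; as a language-neutral illustration of the kind of consistency check it performs, here is a Python sketch (the function names and rounding tolerance are assumptions for illustration, not statcheck's API). It recomputes a two-sided p value from a reported z statistic and flags whether the reported p value is consistent with it, allowing for rounding:

```python
import math

def p_from_z(z):
    """Two-sided p value for a z statistic, via the normal CDF (math.erf)."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def consistent(z, reported_p, decimals=2):
    """Does the reported p value match the recomputed one,
    up to rounding at the reported number of decimals?"""
    return abs(round(p_from_z(z), decimals) - reported_p) < 10 ** -decimals / 2

# A reported result "z = 2.20, p = .03" is internally consistent:
print(round(p_from_z(2.20), 3))  # 0.028
print(consistent(2.20, 0.03))    # True
print(consistent(2.20, 0.05))    # False: .05 does not match z = 2.20
```

The real package extends this idea to t, F, chi-square, and other APA-reported tests, and distinguishes ordinary inconsistencies from "decision errors" where the reported and recomputed p values fall on opposite sides of .05.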
re3data.org
Tags: Data Repositories; Interdisciplinary; Replications
The Registry of Research Data Repositories (re3data.org) is a global registry of research data repositories across academic disciplines. It presents repositories for the permanent storage of and access to data sets to researchers, funding bodies, publishers, and scholarly institutions. re3data.org promotes a culture of sharing, increased access, and better visibility of research data. The registry went live in autumn 2012 and is funded by the German Research Foundation (DFG).
Transparent and Open Social Science Research
Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility; Meta-Analyses; Pre-Analysis Plans; Registries; Replications; Statistical Literacy; Transparent Reporting
Demand is growing for evidence-based policymaking, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.
You can access the course videos for self-paced learning on the BITSS YouTube channel here (also available with subtitles in French here). You can also enroll for free during curated course runs on the FutureLearn platform.
Curate Science
Tags: Issues with transparency and reproducibility; Metascience (Methods and Archival Science); Psychology; Replications; Sociology
Curate Science is a crowd-sourced platform to track, organize, and interpret replications of published findings in the social sciences. Curated replication study characteristics include links to PDFs, open/public data, open/public materials, pre-registered protocols, independent variables (IVs), outcome variables (DVs), replication type, replication design differences, and links to associated evidence collections that feature meta-analytic forest plots.
Political Science Replication
Tags: Dynamic Documents and Coding Practices; Replications
Political Science Replication is a blog about reproducibility, replication, pre-registration, research transparency and open peer review.
Replication Network
Tags: Economics and Finance; Psychology; Public Policy; Replications
The Replication Network is a group of economists dedicated to promoting the practice of replication in the field of economics.
Replication Wiki
Tags: Economics and Finance; Psychology; Replications
ReplicationWiki is a wiki-based service which lists and provides links to replications of empirical studies in economics, studies which have yet to be replicated, and material to assist with replication.
Impact Evaluation Replication Programme
Tags: Economics and Finance; Political Science; Public Policy; Replications
The International Initiative for Impact Evaluation (3ie) Replication Grant programme funds replications. Funding requests are reviewed on a rolling basis, and high-quality applicants are invited to submit full proposals.
Edawax
Tags: Replications
Edawax conducts meta-research on a variety of topics related to research practices, including an analysis of the data sharing policies of peer-reviewed journals, with the hope of 1) gaining insight into the obstacles to performing replications and 2) using those insights to develop resources and infrastructure that facilitate replications and meta-analysis.
Experimental Lab Standard Operating Procedures
Tags: Data Management; Meta-Analyses; Political Science; Pre-Analysis Plans; Replications; Transparent Reporting
This standard operating procedure (SOP) document describes the default practices of the experimental research group led by Donald P. Green at Columbia University. These defaults apply to analytic decisions that have not been made explicit in pre-analysis plans (PAPs). They are not meant to override decisions that are laid out in PAPs. The contents of our lab’s SOP are available for public use. We welcome others to copy or adapt it to suit their research purposes.