Resource Library
The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by applying criteria along these parameters or use the search bar to find what you’re looking for.
Know of a great resource that we haven’t included or have questions about the existing resources? Email us!
Social Science Reproduction Platform
Tags: Economics; Issues with transparency and reproducibility; Metascience (Methods and Archival Science); Other Social Sciences; Political Science; Psychology; Public Health; Public Policy; Replications; Reproducibility; Sociology; Statistics and Data Science
Open Research Calendar
Tags: Data Management; Issues with transparency and reproducibility; Open Publishing; Open Science; Reproducibility; Statistical Literacy
Open Research Calendar is an open-source community tool that collates information on worldwide events related to open science and research.
An Introduction to Open Science
Tags: Interdisciplinary; Open Science
This presentation by Felix Schönbrodt gives an overview of the motivation for open science and an introduction to the research tools and practices commonly associated with open science. The slides can be re-used and distributed under the CC BY license.
Framework for Open and Reproducible Research Training (FORRT)
Tags: Data Management; Dynamic Documents and Coding Practices; Interdisciplinary; Issues with transparency and reproducibility; Pre-Analysis Plans; Statistical Literacy; Transparent Reporting
FORRT is a pedagogical infrastructure designed to recognize and support the teaching and mentoring of open and reproducible science tenets in tandem with prototypical subject matters in higher education. FORRT also advocates for the opening of teaching and mentoring materials as a means to facilitate access, discovery, and learning to those who otherwise would be educationally disenfranchised.
Changing Incentives Toward Transparency
Tags: Issues with transparency and reproducibility; Transparency
Find slides from a presentation by Brian Nosek titled “Changing Incentives Toward Transparency”.
Framing Transparency in Research: Issues and Opportunities
Tags: Issues with transparency and reproducibility; Transparency
Find slides from a presentation by Victoria Stodden titled “Framing Transparency in Research: Issues and Opportunities”.
Promoting Transparency in the Social Sciences
Tags: Issues with transparency and reproducibility; Social Science; Transparency
Find slides from a presentation by Edward Miguel titled “Promoting Transparency in Social Science Research”.
Research Transparency Overview (French)
Tags: Issues with transparency and reproducibility
Find slides from a presentation by Zachary Tsala Dimbuene titled “Research Transparency Overview (French)”.
Implementing an RTR Strategy
Tags: Issues with transparency and reproducibility
Find slides from a presentation by Arnaud Vaganay titled “Implementing an RTR Strategy”.
Drafting RTR Guidelines
Tags: Issues with transparency and reproducibility
Find slides from a presentation by Arnaud Vaganay titled “Drafting RTR Guidelines”.
Research Transparency and Reproducibility (RTR)
Tags: Issues with transparency and reproducibility
Find slides from a presentation by Arnaud Vaganay titled “Research Transparency and Reproducibility (RTR)”.
Open Science Success Stories
Tags: Data Management; Issues with transparency and reproducibility
The Open Research Funders Group curates the Open Science Success Stories, a database of examples of how openness has benefited researchers and broader society.
Videos: Research Transparency and Reproducibility Training (RT2) – Washington, D.C.
Tags: Data Management; Interdisciplinary; Issues with transparency and reproducibility; Meta-Analyses; Power analysis; Pre-Analysis Plans; Preprints; Registries; Replications; Results-Blind Review & Registered Reports; Statistical Literacy; Transparent Reporting; Version Control
BITSS hosted a Research Transparency and Reproducibility Training (RT2) in Washington DC, September 11-13, 2019. This was the eighth training event of this kind organized by BITSS since 2014.
RT2 provides participants with an overview of tools and best practices for transparent and reproducible social science research. Click here to view videos of presentations given during the training. Find slide decks and other useful materials on this OSF project page (https://osf.io/3mxrw/).
Replicability Seminar
Tags: Issues with transparency and reproducibility; Statistical Literacy
Course syllabus for “Replicability Seminar”, an advanced undergraduate and graduate-level course led by Simine Vazire.
BITSS Training Survey Templates
Tags: Interdisciplinary; Issues with transparency and reproducibility
BITSS developed templates for pre- and post-training surveys that can be used by instructors to record learning outcomes in research transparency and reproducibility training events.
The links grant access as an editor; please make a copy of each form to use it for your own purposes.
Transparent and Open Social Science Research (FR)
Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility
Demand is growing for evidence-based policy making, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created and administered by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.
PhD Course Materials: Transparent, Open, and Reproducible Policy Research
Tags: Data Management; Dynamic Documents and Coding Practices; Health Sciences; Interdisciplinary; Issues with transparency and reproducibility; Meta-Analyses; Open Publishing; Pre-Analysis Plans; Preprints; Public Policy; Registries; Replications; Statistical Literacy; Transparent Reporting; Version Control
BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.
Transparency Training Module for Undergraduate Experimental Economics
Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility; Meta-Analyses; Pre-Analysis Plans; Replications; Statistical Literacy
These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.
These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.
Course Syllabi for Open and Reproducible Methods
Tags: Anthropology, Archaeology, and Ethnography; Data Repositories; Data Visualization; Dynamic Documents and Coding Practices; Economics and Finance; Engineering and Computer Science; Health Sciences; Humanities; Interdisciplinary; Issues with transparency and reproducibility; Life Sciences; Linguistics; Meta-Analyses; Metascience (Methods and Archival Science); Open Publishing; Other Social Sciences; Political Science; Power analysis; Pre-Analysis Plans; Psychology; Public Policy; Registries; Replications; Sociology; Statistical Literacy; Statistics and Data Science; Transparent Reporting; Version Control
A collection of course syllabi from across disciplines featuring content that examines or improves open and reproducible research practices. Housed on the OSF.
Improving the Credibility of Social Science Research: A Practical Guide for Researchers
Tags: Data Management; Economics and Finance; Interdisciplinary; Issues with transparency and reproducibility; Political Science; Pre-Analysis Plans; Psychology; Public Policy; Registries; Replications; Sociology
Accountable Replications Policy “Pottery Barn”
Tags: Dynamic Documents and Coding Practices; Open Publishing; Psychology; Replications
The Accountable Replication Policy commits the Psychology and Cognitive Neuroscience section of Royal Society Open Science to publishing replications of studies previously published within the journal. Authors can either submit a replication study that is already completed or a proposal to replicate a previous study. To ensure that the review process is unbiased by the results, submissions are reviewed with existing results initially redacted (where applicable) or, in the case of study proposals, before the results exist. Submissions that report close, clear, and valid replications of the original methodology are offered in-principle acceptance, which virtually guarantees publication of the replication regardless of the study outcome.
Go Fishing App
Tags: Interdisciplinary; Political Science
If you get to choose your tests after you see the data, you can get whatever results you like. To see the logic, try out this fishy test.
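The inflation the app demonstrates can be sketched in a few lines of Python (a hypothetical simulation, not the app's actual code): when a treatment has no effect on any of ten outcomes and you report only the test that "worked", you find something "significant" far more often than the nominal 5%.

```python
# Hypothetical sketch of "fishing": with no true effects anywhere,
# reporting the best of many tests inflates the false-positive rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, n_outcomes, n = 2000, 10, 50
hits = 0
for _ in range(n_sims):
    treat = rng.normal(size=(n, n_outcomes))    # zero effect on every outcome
    control = rng.normal(size=(n, n_outcomes))
    pvals = [stats.ttest_ind(treat[:, j], control[:, j]).pvalue
             for j in range(n_outcomes)]
    if min(pvals) < 0.05:        # report only the "best" test
        hits += 1
print(f"studies with a 'significant' finding: {hits / n_sims:.0%}")
# Expect roughly 1 - 0.95**10, i.e. about 40%, not 5%.
```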
statcheck Web App
Tags: Interdisciplinary; Metascience (Methods and Archival Science); Psychology; Replications; Transparent Reporting
statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck. Using statcheck/web, you can check any PDF for statistical errors without installing the R programming language on your computer.
Retraction Watch
Tags: Interdisciplinary; Replications
Retraction Watch is a blog that reports on retractions of scientific papers, as a window into the scientific process.
Improving Your Statistical Inference
Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility; Power analysis; Psychology; Statistical Literacy
This course aims to help you draw better statistical inferences from empirical research. Students discuss how to correctly interpret p-values, effect sizes, confidence intervals, Bayes factors, and likelihood ratios, and how these statistics answer different questions you might be interested in. They then learn how to design experiments in which the false positive rate is controlled, and how to decide on the sample size for a study, for example to achieve high statistical power. Subsequently, students learn how to interpret evidence in the scientific literature given widespread publication bias, for example through p-curve analysis. Finally, the course discusses philosophy of science, theory construction, and cumulative science, including how to perform replication studies, why and how to pre-register an experiment, and how to share results following Open Science principles.
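One recurring theme of the course, deciding on a sample size that achieves a desired statistical power, can be sketched with the statsmodels Python library (a hypothetical illustration, not course material):

```python
# Sketch of an a priori power analysis for a two-sample t-test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Sample size per group needed to detect a medium effect (Cohen's d = 0.5)
# with 80% power at a two-sided alpha of .05.
n_per_group = analysis.solve_power(effect_size=0.5, power=0.8,
                                   alpha=0.05, alternative='two-sided')
print(f"n per group: {n_per_group:.1f}")   # about 64 per group
```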
Nicebread
Tags: Data Management; Data Visualization; Dynamic Documents and Coding Practices; Interdisciplinary; Issues with transparency and reproducibility; Meta-Analyses; Open Publishing; Power analysis; Pre-Analysis Plans; Preprints; Psychology; Registries; Replications; Results-Blind Review & Registered Reports; Transparent Reporting; Version Control
Dr. Felix Schönbrodt’s blog promoting research transparency and open science.
NeuroChambers
Tags: Issues with transparency and reproducibility; Open Publishing; Power analysis; Pre-Analysis Plans; Psychology; Replications; Results-Blind Review & Registered Reports; Transparent Reporting
Chris Chambers is a psychologist and neuroscientist at the School of Psychology, Cardiff University. He created this blog after taking part in a debate about science journalism at the Royal Institution in March 2012. The aim of his blog is to give readers insights from the trenches of science. He covers a range of science-related issues and may even give up a trade secret or two.
rpsychologist
Tags: Data Management; Dynamic Documents and Coding Practices; Interdisciplinary; Issues with transparency and reproducibility; Open Publishing; Psychology
Kristoffer Magnusson’s blog about R, Statistics, Psychology, Open Science, and Data Visualization.
p-uniform
Tags: Interdisciplinary; Meta-Analyses; Metascience (Methods and Archival Science)
The p-uniform package provides meta-analysis methods that correct for publication bias. Three methods are currently included. The p-uniform method can be used for estimating effect size, testing the null hypothesis of no effect, and testing for publication bias. The second is the hybrid method, a meta-analysis method for combining an original study and its replication while taking into account the statistical significance of the original study. Both the p-uniform and hybrid methods are based on the statistical theory that the distribution of p-values is uniform conditional on the population effect size. The third is the Snapshot Bayesian Hybrid Meta-Analysis Method, which computes posterior probabilities for four true effect sizes (no, small, medium, and large) based on an original study and its replication while taking into account publication bias in the original study. This method can also be used to compute the required sample size of a replication, akin to power analysis in null hypothesis significance testing.
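The problem these methods correct for can be sketched in a few lines (a hypothetical Python simulation; the p-uniform package itself is written in R): when only significant studies reach the literature, the naive average effect size overestimates the true effect.

```python
# Sketch: publication bias inflates naive meta-analytic estimates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_d, n = 0.2, 30                       # small true effect, 30 per group
d_all, d_published = [], []
for _ in range(5000):
    t = rng.normal(true_d, 1, n)          # treatment group
    c = rng.normal(0, 1, n)               # control group
    d = (t.mean() - c.mean()) / np.sqrt((t.var(ddof=1) + c.var(ddof=1)) / 2)
    d_all.append(d)
    if stats.ttest_ind(t, c).pvalue < 0.05:   # only significant studies
        d_published.append(d)                 # make it into the literature
print(f"mean d, all studies:       {np.mean(d_all):.2f}")        # near 0.20
print(f"mean d, published studies: {np.mean(d_published):.2f}")  # inflated
```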
p-curve
Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility; Metascience (Methods and Archival Science); Power analysis; Statistics and Data Science
P-curve is a tool for determining whether reported effects in the literature are true or merely reflect selective reporting. A p-curve is the distribution of statistically significant p-values for a set of studies (ps < .05). Because only true effects are expected to generate right-skewed p-curves, containing more low (.01s) than high (.04s) significant p-values, only right-skewed p-curves are diagnostic of evidential value. By telling us whether we can rule out selective reporting as the sole explanation for a set of findings, p-curve offers a solution to the age-old inferential problems caused by file drawers of failed studies and analyses.
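The intuition can be sketched with a small simulation (hypothetical Python, not the p-curve app itself): under the null, significant p-values are uniform, so about 20% fall below .01; under a true effect, far more do.

```python
# Sketch: p-curves are right-skewed for true effects, flat under the null.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def significant_pvals(true_d, n=30, sims=10000):
    """Collect significant p-values from simulated two-group studies."""
    p = np.array([stats.ttest_ind(rng.normal(true_d, 1, n),
                                  rng.normal(0, 1, n)).pvalue
                  for _ in range(sims)])
    return p[p < 0.05]

for d in (0.0, 0.5):
    p = significant_pvals(d)
    print(f"d = {d}: {np.mean(p < 0.01):.0%} of significant p-values are < .01")
# Null (d = 0.0): about 20%, a flat p-curve; true effect: well above 20%.
```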
statcheck
Tags: Interdisciplinary; Metascience (Methods and Archival Science); Psychology; Replications; Transparent Reporting
statcheck is an R package that checks for errors in statistical reporting in APA-formatted documents. It can help estimate the prevalence of reporting errors and serves as a tool to check your own work before submitting. The package can be used to automatically extract statistics from articles and recompute p-values. It is also available as a web app.
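The core consistency check is easy to sketch (a hypothetical Python re-implementation of the idea; the real package parses APA-formatted text in R and handles many test types):

```python
# Sketch of statcheck's core idea: recompute the p-value from a reported
# test statistic and flag reports that rounding cannot explain.
from scipy import stats

def check_t_report(t_value, df, reported_p, tol=0.005):
    """Recompute a two-sided p for a t-test and compare it to the
    reported p, allowing for rounding to the reported precision."""
    recomputed = 2 * stats.t.sf(abs(t_value), df)
    return recomputed, abs(recomputed - reported_p) <= tol

# e.g. a paper reports "t(28) = 2.20, p = .04"
p, consistent = check_t_report(2.20, 28, 0.04)
print(f"recomputed p = {p:.3f}, consistent: {consistent}")  # p ≈ .036, True
```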
Transparent and Open Social Science Research
Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility; Meta-Analyses; Pre-Analysis Plans; Registries; Replications; Statistical Literacy; Transparent Reporting
Demand is growing for evidence-based policymaking, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.
You can access the course videos for self-paced learning on the BITSS YouTube channel here (also available with subtitles in French here). You can also enroll for free during curated course runs on the FutureLearn platform.
Manual of Best Practices
Tags: Dynamic Documents and Coding Practices; Issues with transparency and reproducibility; Pre-Analysis Plans; Transparent Reporting
The Manual of Best Practices, written by Garret Christensen (BITSS), is a working guide to the latest best practices for transparent quantitative social science research. The manual is also available, and occasionally updated, on GitHub. For suggestions or feedback, contact garret@berkeley.edu.
Curate Science
Tags: Issues with transparency and reproducibility; Metascience (Methods and Archival Science); Psychology; Replications; Sociology
Curate Science is a crowd-sourced platform to track, organize, and interpret replications of published findings in the social sciences. Curated replication study characteristics include links to PDFs, open/public data, open/public materials, pre-registered protocols, independent variables (IVs), outcome variables (DVs), replication type, replication design differences, and links to associated evidence collections that feature meta-analytic forest plots.
Promise and Perils of Pre-Analysis Plans
Tags: Economics and Finance; Issues with transparency and reproducibility; Pre-Analysis Plans; Registries
Promise and Perils of Pre-analysis Plans, by Ben Olken, lays out many of the items to include in a pre-analysis plan, as well as their history, benefits, and a few potential drawbacks. Pre-analysis plans can be especially useful for reaching agreement about what will be measured, and how, when a partner or funder has a vested interest in the outcome of a study.
Reshaping Institutions
Tags: Economics and Finance; Issues with transparency and reproducibility; Political Science; Pre-Analysis Plans; Statistical Literacy
Reshaping Institutions is a paper by Katherine Casey, Rachel Glennerster, and Edward Miguel that uses a pre-analysis plan to analyze the effects of a community-driven development program in Sierra Leone. They discuss the contents and benefits of a PAP in detail and include a “cherry-picking” table that shows the wide flexibility of analysis that is possible without pre-specification. The PAP itself is included in Appendix A of the supplementary materials, available at the link above.