Resource Library

The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by applying criteria along these parameters or use the search bar to find what you’re looking for.

Know of a great resource that we haven’t included or have questions about the existing resources? Email us!

Social Science Reproduction Platform Economics+

The Social Science Reproduction Platform crowdsources and catalogs attempts to assess and improve the computational reproducibility of social science research. Instructors can use the SSRP in applied social science courses at the graduate or undergraduate level to teach fundamental concepts, methods, and reproducible research practices. Get started by creating a free account and browsing some of the completed reproductions; the guide for instructors contains tips and resources for teaching and grading reproductions using the platform.

Open Research Calendar Data Management+

Open Research Calendar is an open-source community tool that collates information on worldwide events related to open science and research.

An Introduction to Open Science Interdisciplinary+

This presentation by Felix Schönbrodt gives an overview of the motivation for open science and an introduction to the research tools and practices commonly associated with open science. The slides can be re-used and distributed under the CC BY license.

Framework for Open and Reproducible Research Training (FORRT) Data Management+

FORRT is a pedagogical infrastructure designed to recognize and support the teaching and mentoring of open and reproducible science tenets in tandem with prototypical subject matter in higher education. FORRT also advocates for opening teaching and mentoring materials as a means to facilitate access, discovery, and learning for those who would otherwise be educationally disenfranchised.

Implementing an RTR Strategy Issues with transparency and reproducibility

Find slides from a presentation by Arnaud Vaganay titled “Implementing an RTR Strategy”.

Drafting RTR Guidelines Issues with transparency and reproducibility

Find slides from a presentation by Arnaud Vaganay titled “Drafting RTR Guidelines”.

Videos: Research Transparency and Reproducibility Training (RT2) – Washington, D.C. Data Management+

BITSS hosted a Research Transparency and Reproducibility Training (RT2) in Washington DC, September 11-13, 2019. This was the eighth training event of this kind organized by BITSS since 2014.

RT2 provides participants with an overview of tools and best practices for transparent and reproducible social science research. Click here to view videos of presentations given during the training. Find slide decks and other useful materials on this OSF project page (https://osf.io/3mxrw/).


Replicability Seminar Issues with transparency and reproducibility+

Course syllabus for “Replicability Seminar”, an advanced undergraduate and graduate-level course led by Simine Vazire.

BITSS training survey templates Interdisciplinary, Issues with transparency and reproducibility

BITSS developed templates for pre- and post-training surveys that can be used by instructors to record learning outcomes in research transparency and reproducibility training events.

The links below enable access as an editor; please make a copy of each form to use it for your own purposes.

Transparent and Open Social Science Research (FR) Dynamic Documents and Coding Practices+

Demand is growing for evidence-based policy making, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created and administered by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.


PhD Course Materials: Transparent, Open, and Reproducible Policy Research Data Management+

BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.

Transparency Training Module for Undergraduate Experimental Economics Dynamic Documents and Coding Practices+

These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.

These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.


Course Syllabi for Open and Reproducible Methods Anthropology, Archaeology, and Ethnography+

A collection of course syllabi from any discipline featuring content to examine or improve open and reproducible research practices. Housed on the OSF.

Improving the Credibility of Social Science Research: A Practical Guide for Researchers Data Management+

Created by the Policy Design and Evaluation Lab (PDEL) at UCSD, this teaching module was developed to demonstrate the credibility crisis in the social sciences caused by a variety of incentives and practices at both the disciplinary and individual levels, and provide practical steps for researchers to improve the credibility of their work throughout the lifecycle of a project. It is intended for use in graduate-level social science methodology courses—including those in political science, economics, sociology, and psychology—at UCSD and beyond.
These materials were developed as part of a BITSS Catalyst Training Project “Creating Pedagogical Materials to Enhance Research Transparency at UCSD” led by Catalysts Scott Desposato and Craig McIntosh along with Julia Clark, PhD candidate at UCSD.

Accountable Replications Policy “Pottery Barn” Dynamic Documents and Coding Practices+

The Accountable Replication Policy commits the Psychology and Cognitive Neuroscience section of Royal Society Open Science to publishing replications of studies previously published within the journal. Authors can either submit a replication study that is already completed or a proposal to replicate a previous study. To ensure that the review process is unbiased by the results, submissions will be reviewed with existing results initially redacted (where applicable) or, in the case of study proposals, before the results exist. Submissions that report close, clear, and valid replications of the original methodology will be offered in-principle acceptance, which virtually guarantees publication of the replication regardless of the study outcome.

Go Fishing App Interdisciplinary+

If you get to choose your tests after you see the data, you can get whatever results you like. To see the logic, try out this fishy test.
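
The logic is also easy to demonstrate in a few lines of R (a hedged sketch of the general idea, not the app's own code): every dataset below is pure noise, yet by running several tests and keeping whichever comes out best, the false-positive rate climbs well above the nominal 5%.

```r
# Fishing for significance: all data are noise, but we pick the best of
# several tests after looking at the results.
set.seed(42)
n_sims <- 5000
n <- 40

fish <- function() {
  x <- rnorm(n); y <- rnorm(n)                       # no true effect anywhere
  p1 <- t.test(x, y)$p.value                         # test 1: mean difference
  p2 <- cor.test(x, y)$p.value                       # test 2: correlation
  p3 <- t.test(x[1:(n / 2)], y[1:(n / 2)])$p.value   # test 3: a subgroup
  min(p1, p2, p3)                                    # report whichever "worked"
}

p_fished <- replicate(n_sims, fish())
mean(p_fished < .05)  # well above .05, despite there being nothing to find
```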

statcheck Web App Interdisciplinary+

statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck: you can check any PDF for statistical errors without installing R on your computer.

Retraction Watch Interdisciplinary+

Retraction Watch is a blog that reports on retractions of scientific papers, as a window into the scientific process.

Improving Your Statistical Inference Dynamic Documents and Coding Practices+

This course aims to help you draw better statistical inferences from empirical research. Students discuss how to correctly interpret p-values, effect sizes, confidence intervals, Bayes factors, and likelihood ratios, and how these statistics answer different questions you might be interested in. They then learn how to design experiments in which the false-positive rate is controlled, and how to decide on the sample size for a study, for example in order to achieve high statistical power. Subsequently, students learn how to interpret evidence in the scientific literature given widespread publication bias, for example by learning about p-curve analysis. Finally, the course discusses how to do philosophy of science, theory construction, and cumulative science, including how to perform replication studies, why and how to pre-register an experiment, and how to share results following Open Science principles.
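
As a small taste of the sample-size planning covered in the course, base R's power.t.test() (a standard function, used here for illustration rather than taken from the course materials) shows how quickly the required sample size grows as the smallest effect of interest shrinks:

```r
# Sample size needed for 80% power in a two-sample t-test (alpha = .05).
power.t.test(delta = 0.5, sd = 1, sig.level = 0.05, power = 0.80)
# ~64 participants per group for a standardized effect of 0.5 ...

power.t.test(delta = 0.2, sd = 1, sig.level = 0.05, power = 0.80)
# ... but ~394 per group for an effect of 0.2.
```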


Nicebread Data Management+

Dr. Felix Schönbrodt’s blog promoting research transparency and open science.

NeuroChambers Issues with transparency and reproducibility+

Chris Chambers is a psychologist and neuroscientist at the School of Psychology, Cardiff University. He created this blog after taking part in a debate about science journalism at the Royal Institution in March 2012. The aim of his blog is to give you some insights from the trenches of science. He talks about a range of science-related issues and may even give up a trade secret or two.

rpsychologist Data Management+

Kristoffer Magnusson’s blog about R, Statistics, Psychology, Open Science, and Data Visualization.

p-uniform Interdisciplinary+

The p-uniform package provides meta-analysis methods that correct for publication bias. Three methods are currently included. The p-uniform method can be used for estimating effect size, testing the null hypothesis of no effect, and testing for publication bias. The second method, the hybrid method, combines an original study and its replication while taking into account the statistical significance of the original study. Both the p-uniform and hybrid methods are based on the statistical theory that the distribution of p-values is uniform conditional on the population effect size. The third method is the Snapshot Bayesian Hybrid Meta-Analysis Method, which computes posterior probabilities for four true effect sizes (no, small, medium, and large effect) based on an original study and its replication, while taking into account publication bias in the original study. This method can also be used to compute the required sample size of the replication, akin to power analysis in null hypothesis significance testing.
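
A minimal usage sketch, assuming the CRAN puniform package; the argument names yi and vi (per-study effect sizes and sampling variances) and the side argument are assumptions drawn from its interface, so check ?puniform before relying on them:

```r
# install.packages("puniform")
library(puniform)

# Illustrative data: five studies' standardized effect sizes and variances.
yi <- c(0.42, 0.35, 0.51, 0.38, 0.46)
vi <- c(0.030, 0.025, 0.040, 0.028, 0.035)

# p-uniform estimate, test of no effect, and publication-bias test;
# side = "right" because the original studies reported positive effects.
puniform(yi = yi, vi = vi, side = "right")
```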

p-curve Dynamic Documents and Coding Practices+

P-curve is a tool for determining whether reported effects in the literature are true or whether they merely reflect selective reporting. The p-curve is the distribution of statistically significant p-values for a set of studies (ps < .05). Because only true effects are expected to generate right-skewed p-curves – containing more low (.01s) than high (.04s) significant p-values – only right-skewed p-curves are diagnostic of evidential value. By telling us whether we can rule out selective reporting as the sole explanation for a set of findings, p-curve offers a solution to the age-old inferential problems caused by file drawers of failed studies and analyses.
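
The right-skew logic is easy to see in simulation (a hedged sketch of the underlying idea, not the authors' p-curve app):

```r
# Under a true effect, significant p-values bunch near zero (right skew);
# under the null, significant p-values are uniform on (0, .05) — flat.
set.seed(1)
n_sims <- 10000
n <- 30

p_true <- replicate(n_sims, t.test(rnorm(n, mean = 0.5), rnorm(n))$p.value)
p_null <- replicate(n_sims, t.test(rnorm(n), rnorm(n))$p.value)

sig_true <- p_true[p_true < .05]
sig_null <- p_null[p_null < .05]

mean(sig_true < .01)  # large share of .01s: right-skewed, evidential value
mean(sig_null < .01)  # ~20% (uniform on (0, .05)): flat, no evidential value
```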

statcheck Interdisciplinary+

statcheck is an R package that checks for errors in statistical reporting in APA-formatted documents. It can help estimate the prevalence of reporting errors and can be used to check your own work before submitting. The package automatically extracts statistics from articles and recomputes p-values. It is also available as a web app.
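
A minimal sketch of the package in use (statcheck() accepts raw text containing APA-style results; the snippet below is purely illustrative):

```r
# install.packages("statcheck")
library(statcheck)

# An APA-formatted result whose reported p-value does not match the statistic:
txt <- "The groups differed significantly, t(58) = 1.20, p < .001."
statcheck(txt)
# statcheck recomputes p from t(58) = 1.20 (two-tailed p ~ .24) and flags
# the reported p < .001 as an inconsistency.
```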

Transparent and Open Social Science Research Dynamic Documents and Coding Practices+

Demand is growing for evidence-based policymaking, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.

You can access the course videos for self-paced learning on the BITSS YouTube channel here (also available with French subtitles here). You can also enroll for free during curated course runs on the FutureLearn platform.


Manual of Best Practices Dynamic Documents and Coding Practices+

The Manual of Best Practices, written by Garret Christensen (BITSS), is a working guide to the latest best practices for transparent quantitative social science research. The manual is also available, and occasionally updated, on GitHub. For suggestions or feedback, contact garret@berkeley.edu.


Curate Science Issues with transparency and reproducibility+

Curate Science is a crowd-sourced platform to track, organize, and interpret replications of published findings in the social sciences. Curated replication study characteristics include links to PDFs, open/public data, open/public materials, pre-registered protocols, independent variables (IVs), outcome variables (DVs), replication type, replication design differences, and links to associated evidence collections that feature meta-analytic forest plots.


Promise and Perils of Pre-Analysis Plans Economics and Finance+

Promise and Perils of Pre-analysis Plans, by Ben Olken, lays out many of the items to include in a pre-analysis plan, as well as their history, their benefits, and a few potential drawbacks. Pre-analysis plans can be especially useful for reaching agreement about what will be measured, and how, when a partner or funder has a vested interest in the outcome of a study.

Reshaping Institutions Economics and Finance+

Reshaping Institutions is a paper by Katherine Casey, Rachel Glennerster, and Edward Miguel that uses a pre-analysis plan to analyze the effects of a community-driven development program in Sierra Leone. They discuss the contents and benefits of a PAP in detail, and include a “cherry-picking” table that shows the wide flexibility of analysis that is possible without pre-specification. The PAP itself is included in Appendix A of the supplementary materials, available at the link above.