Resource Library

The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by applying criteria along these parameters or use the search bar to find what you’re looking for.

Know of a great resource that we haven’t included or have questions about the existing resources? Email us!


30 Results

Framework for Open and Reproducible Research Training (FORRT) Data Management+

FORRT is a pedagogical infrastructure designed to recognize and support the teaching and mentoring of open and reproducible science tenets in tandem with prototypical subject matters in higher education. FORRT also advocates for the opening of teaching and mentoring materials as a means to facilitate access, discovery, and learning for those who would otherwise be educationally disenfranchised.
Read More →

Perspectives from Biomedical Research Health Sciences+

Find slides from a presentation by Maya Petersen titled “Pre-Registration, Pre-analysis, and Transparent Reporting: Perspectives from biomedical research”.

Data Citations module Data Management+

Created by the Federal Reserve Bank of St. Louis, this module introduces students to the key elements of data citations. See also related modules for Data Literacy.

CRediT (Contributor Roles Taxonomy) Interdisciplinary, Transparent Reporting

CRediT (Contributor Roles Taxonomy) is a high-level taxonomy of 14 roles that can be used to represent the roles typically played by contributors to scientific scholarly output. The roles describe each contributor’s specific contribution to the scholarly output.
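
Because the taxonomy is a fixed set of named roles, contributor statements based on it are easy to make machine-readable. The sketch below is a hypothetical Python structure, not an official CRediT schema: the role names follow the taxonomy, but the data layout and author names are illustrative only.

```python
# Minimal sketch: recording author contributions with the 14 CRediT roles.
# The role names follow the taxonomy; the data layout and the authors are
# hypothetical, not an official CRediT schema.

CREDIT_ROLES = {
    "Conceptualization", "Data curation", "Formal analysis",
    "Funding acquisition", "Investigation", "Methodology",
    "Project administration", "Resources", "Software", "Supervision",
    "Validation", "Visualization", "Writing - original draft",
    "Writing - review & editing",
}

# Map each (hypothetical) author to the roles they played.
contributions = {
    "Author A": {"Conceptualization", "Methodology", "Writing - original draft"},
    "Author B": {"Software", "Formal analysis", "Writing - review & editing"},
}

# Validate the roles against the taxonomy and print a contributor statement.
for author, roles in contributions.items():
    unknown = roles - CREDIT_ROLES
    if unknown:
        raise ValueError(f"{author}: not CRediT roles: {sorted(unknown)}")
    print(f"{author}: {', '.join(sorted(roles))}")
```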

Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Educational Expansion Epidemiology+

Created by Catalyst Melissa Sharp, this is an open-source repository of epidemiological research methods and reporting skills for observational studies, structured around the STROBE Statement. Use it to discover new methods and reporting guidelines, and contribute through the GitHub repository (https://github.com/sharpmel/STROBECourse/).

Read More →

Videos: Research Transparency and Reproducibility Training (RT2) – Washington, D.C. Data Management+

BITSS hosted a Research Transparency and Reproducibility Training (RT2) in Washington DC, September 11-13, 2019. This was the eighth training event of this kind organized by BITSS since 2014.

RT2 provides participants with an overview of tools and best practices for transparent and reproducible social science research. Click here to view videos of presentations given during the training. Find slide decks and other useful materials on this OSF project page (https://osf.io/3mxrw/).

Read More →

Stage 1 Registered Report Submission Template Economics and Finance+

BITSS prepared a template to assist authors in the preparation of their Stage 1 Proposal submissions to the Journal of Development Economics. The template expands on features that are commonly reported in pre-analysis plans in development economics, and includes a checklist to help authors record different parts of the research design.

NRIN Collection of Resources on Research Integrity Data Management+

Curated by the Netherlands Research Integrity Network (NRIN), this collection contains literature, tools, guidelines, and educational media related to research integrity. Access the collection here.

PhD Course Materials: Transparent, Open, and Reproducible Policy Research Data Management+

BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.

Course Syllabi for Open and Reproducible Methods Anthropology, Archaeology, and Ethnography+

A collection of course syllabi from any discipline featuring content to examine or improve open and reproducible research practices. Housed on the OSF.

TOP Guidelines Interdisciplinary, Transparent Reporting

The Transparency and Openness Promotion (TOP) Guidelines are a set of eight modular transparency standards for academic journals, each with three levels of increasing stringency. Journals select which of the eight transparency standards they wish to adopt, and choose a level of implementation for each selected standard (see the sketch below). These features provide flexibility for adoption across disciplines while still establishing community standards.

Read More →
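
As a rough illustration of this modular design, the Python sketch below records a hypothetical journal policy using the eight TOP standard names; the chosen standards and levels, and the data layout itself, are assumptions for illustration rather than an official TOP format.

```python
# Minimal sketch of how a journal's TOP adoption might be recorded.
# The eight standard names follow the TOP Guidelines; the policy below is
# hypothetical and the data layout is not an official TOP format.

TOP_STANDARDS = [
    "Citation Standards",
    "Data Transparency",
    "Analytic Methods (Code) Transparency",
    "Research Materials Transparency",
    "Design and Analysis Transparency",
    "Preregistration of Studies",
    "Preregistration of Analysis Plans",
    "Replication",
]

# Hypothetical journal policy: a subset of standards, each at a level 1-3
# of increasing stringency; standards the journal has not adopted are omitted.
journal_policy = {
    "Data Transparency": 2,
    "Analytic Methods (Code) Transparency": 2,
    "Preregistration of Studies": 1,
}

for standard in TOP_STANDARDS:
    level = journal_policy.get(standard)
    status = f"Level {level}" if level else "not adopted"
    print(f"{standard}: {status}")
```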

EQUATOR Network Health Sciences+

The EQUATOR Network is an organization that tracks reporting guidelines across numerous types of studies. It currently offers suggestions from over 275 guidelines, depending on the type of research you are engaged in.

CONSORT Statement Health Sciences+

The 2010 CONSORT Statement is a widely adopted set of recommendations for randomized trial reporting. It includes a concise reporting checklist for researchers to follow, and has been published in the British Medical Journal, the Lancet, and PLoS Medicine.

PRISMA Interdisciplinary+

PRISMA is an evidence-based minimum set of items for reporting in systematic reviews and meta-analyses. PRISMA focuses on the reporting of reviews evaluating randomized trials, but can also be used as a basis for reporting systematic reviews of other types of research, particularly evaluations of interventions.


STROBE Statement Health Sciences+

The STROBE Statement is a reporting guideline written for observational studies in epidemiology. It incorporates a checklist of 22 items considered essential for observational study reporting.

statcheck Web App Interdisciplinary+

statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck. Using statcheck/web, you can check any PDF for statistical errors without installing the R programming language on your computer.

Nicebread Data Management+

Dr. Felix Schönbrodt’s blog promoting research transparency and open science.

NeuroChambers Issues with transparency and reproducibility+

Chris Chambers is a psychologist and neuroscientist at the School of Psychology, Cardiff University. He created this blog after taking part in a debate about science journalism at the Royal Institution in March 2012. The aim of his blog is to give you some insights from the trenches of science. He talks about a range of science-related issues and may even give up a trade secret or two.


Read More →

The New Statistics (+OSF Learning Page) Data Management+

This OSF project helps organize resources for teaching the “New Statistics” — an approach that emphasizes asking quantitative questions, focusing on effect sizes, using confidence intervals to express uncertainty about effect sizes, using modern data visualizations, seeking replication, and using meta-analysis as a matter of course.
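
For a concrete flavor of this estimation-focused workflow, here is a minimal Python/SciPy sketch with made-up data that reports an effect size as a difference in means with a 95% confidence interval, rather than a bare p value; all names and numbers are illustrative only.

```python
# Minimal sketch of an estimation-focused analysis in the spirit of the
# "New Statistics": report an effect size (here, a difference in means)
# with a 95% confidence interval rather than only a p value. Data are made up.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=10.0, scale=2.0, size=40)
treated = rng.normal(loc=11.0, scale=2.0, size=40)

diff = treated.mean() - control.mean()

# Welch standard error and degrees of freedom for the difference in means.
var_t = treated.var(ddof=1) / treated.size
var_c = control.var(ddof=1) / control.size
se = np.sqrt(var_t + var_c)
df = (var_t + var_c) ** 2 / (
    var_t**2 / (treated.size - 1) + var_c**2 / (control.size - 1)
)

ci_low, ci_high = diff + np.array([-1, 1]) * stats.t.ppf(0.975, df) * se
print(f"Mean difference: {diff:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```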


statcheck Interdisciplinary+

statcheck is an R package that checks for errors in statistical reporting in APA-formatted documents. It can help estimate the prevalence of reporting errors and is a tool to check your own work before submitting. The package can be used to automatically extract statistics from articles and recompute p values. It is also available as a web app.
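
To illustrate the kind of check this automates, the sketch below (written in Python rather than R, and not using statcheck's actual parsing rules) re-derives the p value implied by a hypothetical APA-style t test result and compares it to the reported value.

```python
# Minimal sketch (in Python, not the statcheck R package itself) of the kind
# of consistency check statcheck automates: re-derive the p value implied by
# a reported t statistic and compare it to the reported p value.
import re
from scipy import stats

reported = "t(28) = 2.20, p = .04"  # hypothetical APA-style result

match = re.match(r"t\((\d+)\)\s*=\s*([-\d.]+),\s*p\s*=\s*(\.?\d+)", reported)
df, t_value, p_reported = float(match[1]), float(match[2]), float(match[3])

# Two-sided p value implied by the reported t statistic.
p_recomputed = 2 * stats.t.sf(abs(t_value), df)

# Flag a mismatch larger than rounding of a two-decimal p value would allow.
if abs(p_recomputed - p_reported) > 0.005:
    print(f"Possible reporting error: recomputed p = {p_recomputed:.3f}")
else:
    print(f"Consistent with reported value: recomputed p = {p_recomputed:.3f}")
```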

Transparent and Open Social Science Research Dynamic Documents and Coding Practices+

Demand is growing for evidence-based policymaking, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.

You can access the course videos for self-paced learning on the BITSS YouTube channel here (also available with subtitles in French here). You can also enroll for free during curated course runs on the FutureLearn platform.

Read More →

Manual of Best Practices Dynamic Documents and Coding Practices+

Manual of Best Practices, written by Garret Christensen (BITSS), is a working guide to the latest best practices for transparent quantitative social science research. The manual is also available, and occasionally updated, on GitHub. For suggestions or feedback, contact garret@berkeley.edu.

Read More →

Implementing Reproducible Research Dynamic Documents and Coding Practices+

Implementing Reproducible Research by Victoria Stodden, Friedrich Leisch, and Roger D. Peng covers many of the elements necessary for conducting and distributing reproducible research. The book focuses on the tools, practices, and dissemination platforms for ensuring reproducibility in computational science.

Pre-Analysis Plan Template Economics and Finance+

Pre-analysis Plan Template, by Alejandro Ganimian, is useful for instructors when teaching transparency methods, and for researchers themselves when developing their own pre-analysis plan.

Find a .doc version of this template here. Find a .tex version here.

Read More →

Pre-Analysis Plan Checklist Economics and Finance+

Pre-analysis Plan Checklist, by David McKenzie, Lead Economist at the World Bank Development Research Group.

Experimental Lab Standard Operating Procedures Data Management+

This standard operating procedure (SOP) document describes the default practices of the experimental research group led by Donald P. Green at Columbia University. These defaults apply to analytic decisions that have not been made explicit in pre-analysis plans (PAPs); they are not meant to override decisions that are laid out in PAPs. The contents of the lab’s SOP are available for public use, and others are welcome to copy or adapt it to suit their research purposes.

Read More →

Standardized Disclosure Peer Review Psychology, Transparent Reporting

A standard statement developed for peer review in psychology.

“I request that the authors add a statement to the paper confirming whether, for all experiments, they have reported all measures, conditions, data exclusions, and how they determined their sample sizes. The authors should, of course, add any additional text to ensure the statement is accurate. This is the standard reviewer disclosure request endorsed by the Center for Open Science [see http://osf.io/project/hadz3]. I include it in every review.”

Blog discussion of the statement: here and here.

Read More →

Zotero Interdisciplinary, Transparent Reporting

Zotero is the only research tool that automatically senses content in your web browser, allowing you to add it to your personal library with a single click. Whether you’re searching for a preprint on arXiv.org, a journal article from JSTOR, a news story from the New York Times, or a book from your university library catalog, Zotero has you covered with support for thousands of sites.