Resource Library

The BITSS Resource Library contains resources for learning, teaching, and practicing research transparency and reproducibility, including curricula, slide decks, books, guidelines, templates, software, and other tools. All resources are categorized by i) topic, ii) type, and iii) discipline. Filter results by applying criteria along these parameters or use the search bar to find what you’re looking for.

Know of a great resource that we haven’t included or have questions about the existing resources? Email us!

Meta | Life Sciences

Meta is an interactive search tool that indexes research in the biomedical sciences. Users can create personalized news feeds by selecting from a range of concepts, journals, preprints, or papers of interest.

Transparent and Open Social Science Research (FR) | Dynamic Documents and Coding Practices

Demand is growing for evidence-based policy making, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created and administered by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.

Software Carpentry | Data Management and De-identification

Software Carpentry offers online tutorials for data analysis, including Version Control with Git, Using Databases and SQL, Programming with Python, Programming with R, and Programming with MATLAB.

BITSS training survey templates | Interdisciplinary, Issues with transparency and reproducibility

BITSS developed templates for pre- and post-training surveys that can be used by instructors to record learning outcomes in research transparency and reproducibility training events.

The links below grant editor access; please make a copy of each form to use it for your own purposes.

Web Plot Digitizer | Data Management and De-identification

Web Plot Digitizer is a web-based tool for extracting the underlying data values from images of charts and plots.
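
The core idea behind chart digitizers like this one is axis calibration: pick reference points whose pixel and data coordinates are both known, then map every other pixel by interpolation. A minimal sketch in Python (the helper below is hypothetical, not Web Plot Digitizer's API; log-scaled axes would need log-space interpolation):

```python
# Hypothetical illustration of linear axis calibration, the idea behind chart digitizing.

def calibrate_axis(pixel_lo, pixel_hi, value_lo, value_hi):
    """Return a function mapping a pixel coordinate to a data value,
    given two reference points with known pixel and data coordinates."""
    scale = (value_hi - value_lo) / (pixel_hi - pixel_lo)
    return lambda pixel: value_lo + (pixel - pixel_lo) * scale

# Example: x-axis reference points picked at pixels 100 and 500,
# corresponding to data values 0 and 10 on the published chart.
pixel_to_x = calibrate_axis(100, 500, 0.0, 10.0)
print(pixel_to_x(300))  # 5.0, the data value at pixel 300
```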

ResponsibleData.io | Data Management and De-identification

Using data for social change work offers many opportunities, but it brings challenges, too. The Responsible Data (RD) community develops practical ways to address the unintended consequences of using data in this work, establishes best practices, and shares approaches among leading thinkers and doers from different sectors. The community discusses thorny topics in person, facilitates online group discussions on the RD mailing list, and shares resources on its site.

Observational PAP Guide | Economics and Finance

In her preprint titled “Improving transparency in observational social science research: A pre-analysis plan approach”, Fiona Burlig (University of Chicago) presents three scenarios in which study preregistration and pre-analysis plans (PAPs) can be credibly applied in non-experimental settings: cases where researchers collect their own data; prospective studies; and research using restricted-access data. The preprint also includes suggested contents for observational PAPs, and highlights where observational PAPs should deviate from those designed for experimental research.

This work was also published in the journal Economics Letters.

Registry for International Development Impact Evaluations (RIDIE) | Economics and Finance

Administered by the International Initiative for Impact Evaluation (3ie), the Registry for International Development Impact Evaluations (RIDIE) is a registry of impact evaluations related to development in low- and middle-income countries. RIDIE will register any development impact evaluation that rigorously attempts to estimate the causal impacts of a program, including but not limited to randomized controlled trials. It is intended to be a prospective registry in which researchers and evaluators can record information about their evaluation designs before conducting the analysis, update that information as the study proceeds, and post findings upon study completion.

Catalog of open source licenses | Interdisciplinary, Open Publishing

Using this online tool, you can choose an open source license to clearly articulate the conditions under which others can use, distribute, modify or contribute to your software and non-software projects.

Conda | Data Visualization

Conda is an open-source package and environment management system that runs on Windows, macOS, and Linux. It installs, runs, and updates packages and their dependencies, and works across multiple languages, including Python, R, Ruby, Lua, Scala, Java, JavaScript, C/C++, and Fortran.
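
A typical reproducibility workflow with conda is to create an environment with pinned dependencies, export it, and have collaborators rebuild it from the exported file. A minimal sketch, scripted from Python for illustration (conda itself is a command-line tool; the environment and package names below are just examples):

```python
# Minimal sketch of a reproducible conda workflow, driven from Python via subprocess.
# Assumes the conda CLI is installed; "analysis-env" and the packages are example names.
import subprocess

# Create an environment with pinned core dependencies.
subprocess.run(["conda", "create", "-n", "analysis-env",
                "python=3.11", "numpy", "pandas", "-y"], check=True)

# Export the fully resolved environment so collaborators can rebuild it exactly.
with open("environment.yml", "w") as f:
    subprocess.run(["conda", "env", "export", "-n", "analysis-env"], stdout=f, check=True)

# A collaborator recreates the same environment from the exported file:
# subprocess.run(["conda", "env", "create", "-f", "environment.yml"], check=True)
```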

Stage 1 Registered Report Submission Template | Economics and Finance

BITSS prepared a template to assist authors in the preparation of their Stage 1 Proposal submissions to the Journal of Development Economics. The template expands on features that are commonly reported in pre-analysis plans in development economics, and includes a checklist to help authors record different parts of the research design.

Whole Tale | Data Management and De-identification

Whole Tale is a research infrastructure that allows users to share data, methods and analysis protocols, and final research outputs in a single executable object (a “living publication” or “tale”) alongside any research publication.

NRIN Collection of Resources on Research Integrity | Data Management and De-identification

Curated by the Netherlands Research Integrity Network (NRIN), this collection contains literature, tools, guidelines, and educational media related to research integrity.

PhD Course Materials: Transparent, Open, and Reproducible Policy Research | Data Management and De-identification

BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.

COS Registered Reports information portal | Interdisciplinary

The Center for Open Science (COS) has put together a portal containing information about the Registered Reports format of peer review and publication. The portal includes general information about Registered Reports, a list of journals that have implemented the format, an explanation of an appropriate workflow, resources for journal editors, motivation for funders, FAQs, and a list of allied initiatives, including those that focus on results-blind review and Exploratory Reports.

Mapping the Universe of Registered Reports | Interdisciplinary

A preprint by Tom Hardwicke and John Ioannidis. Abstract: Selection pressures for significant results may infuse bias into the research process. We evaluated the implementation of one innovation designed to mitigate this bias, ‘Registered Reports’, where study protocols are peer-reviewed and granted in-principle acceptance (IPA) for publication before the study has been conducted. As of February 2018, 91 journals had adopted Registered Reports and 91 Final Reports had been published. Psychology journals are the principal adopters, but expansion has begun into medicine, social science, and other fields. Among 29 journals that responded to a survey, 334 protocols had been submitted to them, 87 had been granted IPA and 32 Final Reports had been published or were in press as of July 2017. We encountered several sub-optimal implementation practices, including non-availability of IPA protocols, and diverse approaches to protocol registration in the absence of a single central registry. Registered Reports should be iteratively evaluated and improved to ensure maximal benefits.

Course Syllabi for Open and Reproducible Methods | Anthropology, Archaeology, and Ethnography

A collection of course syllabi from any discipline featuring content to examine or improve open and reproducible research practices. Housed on the OSF.

MetaArXiv | Interdisciplinary, Open Publishing

An interdisciplinary archive of articles focused on improving research transparency and reproducibility.

DMPTool | Data Management and De-identification

The DMPTool is a free service developed by the California Digital Library (CDL) and DataONE that helps researchers and institutions create high-quality data management plans that meet funder requirements.

AsPredicted.org | Interdisciplinary

AsPredicted.org is “a standardized pre-registration that requires only what’s necessary to separate exploratory from confirmatory analyses.” You can easily generate a short and simple pre-registration document that “takes less effort to evaluate than it takes to evaluate the published study itself.” The form, designed by Uri Simonsohn, Joe Simmons, and Leif Nelson, has only nine questions, which are general enough to be relevant to nearly all disciplines and types of research.

AEA RCT Registry | Economics and Finance

The American Economic Association (AEA) Randomized Controlled Trials Registry is a registry for RCTs conducted in the social sciences. Because existing registries were not well suited to the needs of the social sciences, the AEA Executive Committee decided in April 2012 to establish such a registry for economics and other social sciences.

rOpenSci Packages | Data Management and De-identification

These packages are carefully vetted, staff- and community-contributed R software tools that lower barriers to working with scientific data sources and data that support research applications on the web.

Improving the Credibility of Social Science Research: A Practical Guide for Researchers | Data Management and De-identification

Created by the Policy Design and Evaluation Lab (PDEL) at UCSD, this teaching module was developed to (1) demonstrate the credibility crisis in the social sciences caused by a variety of incentives and practices at both the disciplinary and individual levels, and (2) provide practical steps for researchers to improve the credibility of their work throughout the lifecycle of a project. It is intended for use in graduate-level social science methodology courses—including those in political science, economics, sociology, and psychology—at UCSD and beyond.
These materials were developed as part of the BITSS Catalyst Training Project “Creating Pedagogical Materials to Enhance Research Transparency at UCSD,” led by Catalysts Scott Desposato and Craig McIntosh along with Julia Clark, a PhD candidate at UCSD.

TOP Guidelines | Interdisciplinary, Transparent Reporting

The Transparency and Openness Promotion (TOP) Guidelines are a set of eight modular transparency standards for academic journals, each with three levels of increasing stringency. Journals select which of the eight standards they wish to adopt and the level of implementation for each. These features provide flexibility for adoption depending on disciplinary variation, while simultaneously establishing community standards.

EQUATOR Network | Health Sciences

The EQUATOR Network is an organization that tracks reporting guidelines across numerous types of studies. It currently covers over 275 reporting guidelines, with suggestions depending on the type of research you are engaged in.

CONSORT Statement | Health Sciences

The 2010 CONSORT Statement is a widely adopted set of recommendations for randomized trial reporting. It includes a concise reporting checklist for researchers to follow, and has been published in the British Medical Journal, the Lancet, and PLoS Medicine.

PRISMA | Interdisciplinary

PRISMA is an evidence-based minimum set of items for reporting in systematic reviews and meta-analyses. PRISMA focuses on the reporting of reviews evaluating randomized trials, but can also be used as a basis for reporting systematic reviews of other types of research, particularly evaluations of interventions.

STROBE Statement | Health Sciences

The STROBE Statement is a reporting guideline written for observational studies in epidemiology. It incorporates a checklist of 22 items considered essential for observational study reporting.

SPARC (Scholarly Publishing and Academic Resources Coalition) | Data Management and De-identification

This community resource for tracking, comparing, and understanding both current and future U.S. federal funder research data sharing policies is a joint project of SPARC & Johns Hopkins University Libraries.

Royal Society Open Science Registered Reports | Health Sciences

Royal Society Open Science is a fast, open journal publishing high-quality research across all of science, engineering, and mathematics. A Registered Report (RR) is a form of journal article in which methods and proposed analyses are pre-registered and peer-reviewed before the research is conducted (Stage 1). High-quality protocols are then provisionally accepted for publication before data collection commences. The format is open to replication attempts as well as novel studies. Once the study is completed, the author finishes the article, including the results and discussion sections (Stage 2). This is appraised by the reviewers and, provided the necessary conditions are met, published.

Accountable Replications Policy “Pottery Barn” | Dynamic Documents and Coding Practices

The Accountable Replications Policy commits the Psychology and Cognitive Neuroscience section of Royal Society Open Science to publishing replications of studies previously published in the journal. Authors can submit either a completed replication study or a proposal to replicate a previous study. To ensure that the review process is unbiased by the results, submissions are reviewed with existing results initially redacted (where applicable) or, in the case of study proposals, before the results exist. Submissions that report close, clear, and valid replications of the original methodology are offered in-principle acceptance, which virtually guarantees publication of the replication regardless of the study outcome.

Go Fishing App | Interdisciplinary

If you get to choose your tests after you see the data, you can get whatever results you like. To see the logic, try out this fishy test.
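
The same logic can be checked with a short simulation: when there is no true effect but you test several outcomes and keep whichever p-value is smallest, the false-positive rate climbs well above the nominal 5%. A minimal sketch (assuming numpy and scipy; this is not the app's code):

```python
# Simulation of outcome fishing under the null: test several outcomes per study
# and keep the smallest p-value. Not the Go Fishing app itself, just the idea.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, n_outcomes, n = 5000, 5, 30
false_positives = 0

for _ in range(n_sims):
    # Two groups with no true difference on any of the independent outcomes.
    control = rng.normal(size=(n, n_outcomes))
    treatment = rng.normal(size=(n, n_outcomes))
    # "Fishing": test every outcome and keep the smallest p-value.
    pvals = [stats.ttest_ind(treatment[:, j], control[:, j]).pvalue
             for j in range(n_outcomes)]
    if min(pvals) < 0.05:
        false_positives += 1

print(f"False-positive rate with fishing: {false_positives / n_sims:.2f}")
# Roughly 1 - 0.95**5 ≈ 0.23, far above the nominal 0.05.
```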

statcheck Web App | Interdisciplinary

statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck. Using statcheck/web, you can check any PDF for statistical errors without installing the R programming language on your computer.
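
The check statcheck automates is simple in principle: parse an APA-style result, recompute the p-value from the reported test statistic and degrees of freedom, and flag mismatches. A rough sketch of that idea in Python (illustrative only, with a simplified tolerance; statcheck's own rules account for rounding):

```python
# Illustrative consistency check in the spirit of statcheck (not its actual code):
# recompute a two-sided p-value from a reported t statistic and compare it with
# the p-value stated in the text.
import re
from scipy import stats

def check_t_result(apa_string, tolerance=0.01):
    """Parse e.g. 't(28) = 2.20, p = .04' and flag an inconsistent p-value."""
    match = re.search(r"t\((\d+)\)\s*=\s*(-?[\d.]+),\s*p\s*=\s*([\d.]+)", apa_string)
    if not match:
        return None
    df, t_value, reported_p = float(match[1]), float(match[2]), float(match[3])
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)
    return {"flagged": abs(recomputed_p - reported_p) > tolerance,
            "recomputed_p": round(recomputed_p, 3)}

print(check_t_result("t(28) = 2.20, p = .04"))   # consistent: recomputed p ≈ .036
print(check_t_result("t(28) = 2.20, p = .004"))  # flagged: reported p off by a factor of ten
```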

Gates Open Research | Health Sciences

Gates Open Research is a scholarly publishing platform that makes research funded by the Bill & Melinda Gates Foundation available quickly and in a format supporting research integrity, reproducibility and transparency. Its open access model enables immediate publication followed by open, invited peer review, combined with an open data policy.

Retraction Watch | Interdisciplinary

Retraction Watch is a blog that reports on retractions of scientific papers, as a window into the scientific process.

Impact Evaluation in Practice | Data Management and De-identification

The second edition of the Impact Evaluation in Practice handbook is a comprehensive and accessible introduction to impact evaluation for policymakers and development practitioners. First published in 2011, it has been used widely across the development and academic communities. The book incorporates real-world examples to present practical guidelines for designing and implementing impact evaluations. Readers will gain an understanding of impact evaluation and the best ways to use impact evaluations to design evidence-based policies and programs. The updated version covers the newest techniques for evaluating programs and includes state-of-the-art implementation advice, as well as an expanded set of examples and case studies that draw on recent development challenges. It also includes new material on research ethics and partnerships to conduct impact evaluation.

Improving Your Statistical Inference | Dynamic Documents and Coding Practices

This course aims to help you to draw better statistical inferences from empirical research. Students discuss how to correctly interpret p-values, effect sizes, confidence intervals, Bayes Factors, and likelihood ratios, and how these statistics answer different questions you might be interested in. Then, they learn how to design experiments where the false positive rate is controlled, and how to decide upon the sample size for a study, for example in order to achieve high statistical power. Subsequently, students learn how to interpret evidence in the scientific literature given widespread publication bias, for example by learning about p-curve analysis. Finally, the course discusses how to do philosophy of science, theory construction, and cumulative science, including how to perform replication studies, why and how to pre-register an experiment, and how to share results following Open Science principles.
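
The sample-size planning the course covers can be illustrated with a standard power calculation. A minimal sketch using statsmodels (the effect size and power target are example assumptions, not values from the course):

```python
# Minimal power-analysis sketch: sample size per group for a two-sample t-test.
# Uses statsmodels; effect size (Cohen's d = 0.5) and 80% power are example choices.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,
                                    alpha=0.05,
                                    power=0.80,
                                    alternative="two-sided")
print(f"Participants needed per group: {n_per_group:.0f}")  # about 64 per group
```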

