Transparent and Open Social Science Research (FR) (Dynamic Documents and Coding Practices)

Demand is growing for evidence-based policy making, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course created and administered by BITSS, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.

BITSS training survey templates (Interdisciplinary; Issues with transparency and reproducibility)

BITSS developed templates for pre- and post-training surveys that can be used by instructors to record learning outcomes in research transparency and reproducibility training events.

The links below provide editor access; please make a copy of each form to use it for your own purposes:

PhD Course Materials: Transparent, Open, and Reproducible Policy Research (Data Management and De-identification)

BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School, which grants a PhD in Policy Analysis. Find all course materials on the project’s OSF page.

Transparency Training Module for Undergraduate Experimental Economics (Dynamic Documents and Coding Practices)

These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.

These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.

Course Syllabi for Open and Reproducible Methods (Anthropology, Archaeology, and Ethnography)

A collection of course syllabi from any discipline with content that examines or improves open and reproducible research practices. Housed on the OSF.

Improving the Credibility of Social Science Research: A Practical Guide for Researchers (Data Management and De-identification)

Created by the Policy Design and Evaluation Lab (PDEL) at UCSD, this teaching module was developed to (1) demonstrate the credibility crisis in the social sciences caused by a variety of incentives and practices at both the disciplinary and individual levels, and (2) provide practical steps for researchers to improve the credibility of their work throughout the lifecycle of a project. It is intended for use in graduate-level social science methodology courses—including those in political science, economics, sociology, and psychology—at UCSD and beyond.
These materials were developed as part of a BITSS Catalyst Training Project “Creating Pedagogical Materials to Enhance Research Transparency at UCSD” led by Catalysts Scott Desposato and Craig McIntosh along with Julia Clark, PhD candidate at UCSD.

Accountable Replications Policy “Pottery Barn” (Dynamic Documents and Coding Practices)

The Accountable Replication Policy commits the Psychology and Cognitive Neuroscience section of Royal Society Open Science to publishing replications of studies previously published within the journal. Authors can submit either a completed replication study or a proposal to replicate a previous study. To ensure that the review process is unbiased by the results, submissions will be reviewed with existing results initially redacted (where applicable) or, in the case of study proposals, before the results exist. Submissions that report close, clear, and valid replications of the original methodology will be offered in-principle acceptance, which virtually guarantees publication of the replication regardless of the study outcome.


Go Fishing App (Interdisciplinary)

If you get to choose your tests after you see the data, you can get whatever results you like. To see the logic, try out this fishy test.
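
For readers who prefer code to the app, a minimal R sketch of the same logic follows (our illustration, not the Go Fishing app itself): when there is no true effect, testing many outcomes and reporting only the most favorable one yields far more than the nominal 5% of significant results.

```r
# Fishing simulation: two groups with no true difference on any of 10 outcomes.
set.seed(123)

fish_once <- function(n = 30, n_outcomes = 10) {
  # Run a t-test on each unrelated outcome and keep the smallest p-value,
  # mimicking an analyst who picks the test after seeing the data.
  p_values <- replicate(n_outcomes, t.test(rnorm(n), rnorm(n))$p.value)
  min(p_values)
}

fished_p <- replicate(2000, fish_once())
mean(fished_p < .05)  # roughly 0.4 of "studies" come out significant, not 0.05
```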

statcheck Web App (Interdisciplinary)

statcheck is a program, originally written in R, that checks for errors in statistical reporting in APA-formatted documents. statcheck/web is a web-based implementation that lets you check any PDF for statistical errors without installing R on your computer.

Retraction Watch (Interdisciplinary)

Retraction Watch is a blog that reports on retractions of scientific papers, as a window into the scientific process.

Improving Your Statistical Inference (Dynamic Documents and Coding Practices)

This course aims to help you draw better statistical inferences from empirical research. Students discuss how to correctly interpret p-values, effect sizes, confidence intervals, Bayes factors, and likelihood ratios, and how these statistics answer different questions you might be interested in. They then learn how to design experiments in which the false positive rate is controlled, and how to decide on the sample size for a study, for example to achieve high statistical power. Subsequently, students learn how to interpret evidence in the scientific literature given widespread publication bias, for example by learning about p-curve analysis. Finally, the course discusses how to do philosophy of science, theory construction, and cumulative science, including how to perform replication studies, why and how to pre-register an experiment, and how to share results following Open Science principles.
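
As one concrete illustration of the sample-size question the course covers (our sketch, not course material), base R's power.t.test() answers how many participants per group are needed to detect a given effect with a given power:

```r
# How many participants per group are needed to detect a standardized difference
# of d = 0.5 with 90% power at alpha = .05 (two-sided, two-sample t-test)?
power.t.test(delta = 0.5, sd = 1, sig.level = 0.05, power = 0.90)
# With sd = 1, delta is the effect in standard-deviation units; the output
# reports n per group (about 85). At 80% power, n drops to about 64.
```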

Nicebread (Data Management and De-identification)

Dr. Felix Schönbrodt’s blog promoting research transparency and open science.

NeuroChambers (Issues with transparency and reproducibility)

Chris Chambers is a psychologist and neuroscientist at the School of Psychology, Cardiff University. He created this blog after taking part in a debate about science journalism at the Royal Institution in March 2012. The aim of his blog is to give you some insights from the trenches of science. He talks about a range of science-related issues and may even give up a trade secret or two.


rpsychologist (Data Management and De-identification)

Kristoffer Magnusson’s blog about R, Statistics, Psychology, Open Science, and Data Visualization.

p-uniform (Interdisciplinary)

The p-uniform package provides meta-analysis methods that correct for publication bias. Three methods are currently included in the package. The p-uniform method can be used for estimating effect size, testing the null hypothesis of no effect, and testing for publication bias. The second method in the package is the hybrid method, a meta-analysis method for combining an original study and its replication while taking into account the statistical significance of the original study. The p-uniform and hybrid methods are based on the statistical theory that the distribution of p-values is uniform conditional on the population effect size. The third method in the package is the Snapshot Bayesian Hybrid Meta-Analysis Method. This method computes posterior probabilities for four true effect sizes (no, small, medium, and large) based on an original study and replication, while taking into account publication bias in the original study. The method can also be used for computing the required sample size of the replication, akin to power analysis in null hypothesis significance testing.
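
The premise that p-values are uniform conditional on the population effect size can be checked in a short simulation. The sketch below is ours, not part of the p-uniform package, and the effect size and standard error are made-up values: among statistically significant studies, p-values recomputed conditional on significance and evaluated at the true effect are uniformly distributed.

```r
# Simulate many study estimates around a true effect, keep only the "published"
# significant ones, and recompute each p-value conditional on significance,
# evaluated at the true effect size.
set.seed(42)
true_d <- 0.4                 # assumed true standardized effect
se     <- 0.15                # assumed common standard error
crit   <- qnorm(.975) * se    # an estimate must exceed this to be significant

est <- rnorm(50000, mean = true_d, sd = se)
sig <- est[est > crit]        # publication bias: only significant estimates survive

cond_p <- (1 - pnorm(sig,  mean = true_d, sd = se)) /
          (1 - pnorm(crit, mean = true_d, sd = se))
hist(cond_p, breaks = 20)     # approximately flat, i.e. uniform, as the method assumes
```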


p-curve (Dynamic Documents and Coding Practices)

P-curve is a tool for determining whether reported effects in the literature are true or merely reflect selective reporting. The p-curve is the distribution of statistically significant p-values (p < .05) for a set of studies. Because only true effects are expected to generate right-skewed p-curves – containing more low (.01s) than high (.04s) significant p-values – only right-skewed p-curves are diagnostic of evidential value. By telling us whether we can rule out selective reporting as the sole explanation for a set of findings, p-curve offers a solution to the age-old inferential problems caused by file-drawers of failed studies and analyses.
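
A quick simulation (ours, not the p-curve app) makes the skew visible: with no true effect, significant p-values spread evenly between 0 and .05, while a true effect piles them up near .01.

```r
# Compare the distribution of significant p-values under the null vs. a true effect.
set.seed(1)
sim_p <- function(delta, n = 50, sims = 10000) {
  replicate(sims, t.test(rnorm(n, mean = delta), rnorm(n))$p.value)
}
p_null   <- sim_p(0)     # no true effect
p_effect <- sim_p(0.5)   # true effect of d = 0.5

bins <- seq(0, .05, by = .01)
table(cut(p_null[p_null < .05], bins))      # roughly equal counts per bin (flat)
table(cut(p_effect[p_effect < .05], bins))  # far more .01s than .04s (right-skewed)
```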

statcheck (Interdisciplinary)

statcheck is an R package that checks for errors in statistical reporting in APA-formatted documents. It can help estimate the prevalence of reporting errors and is a tool to check your own work before submitting. The package can be used to automatically extract statistics from articles and recompute p-values. It is also available as a web app.
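
A minimal usage sketch follows, assuming the package's statcheck() and checkPDF() functions (consult the package documentation for the current interface); the PDF file name is a placeholder.

```r
# install.packages("statcheck")
library(statcheck)

# Check a snippet of APA-formatted text: t(28) = 2.20 corresponds to p ~ .036,
# so the reported p = .19 would be flagged as inconsistent.
statcheck("The effect was significant, t(28) = 2.20, p = .19.")

# Extract and recompute every APA-style test statistic reported in a PDF.
checkPDF("my_paper.pdf")   # placeholder file name
```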

Transparent and Open Social Science Research (Dynamic Documents and Coding Practices)

Demand is growing for evidence-based policy making, but there is also growing recognition in the social science community that limited transparency and openness in research have contributed to widespread problems. With this course, you can explore the causes of limited transparency in social science research, as well as tools to make your own work more open and reproducible.

You can enroll in the full course for free and access hands-on and social activities on the FutureLearn platform during designated course runs, or access the course videos for self-paced learning on our website here.

Manual of Best Practices (Dynamic Documents and Coding Practices)

The Manual of Best Practices, written by Garret Christensen (BITSS), is a working guide to the latest best practices for transparent quantitative social science research. The manual is also available, and occasionally updated, on GitHub. For suggestions or feedback, contact garret@berkeley.edu.

Curate Science (Issues with transparency and reproducibility)

Curate Science is a crowd-sourced platform to track, organize, and interpret replications of published findings in the social sciences. Curated replication study characteristics include links to PDFs, open/public data, open/public materials, pre-registered protocols, independent variables (IVs), outcome variables (DVs), replication type, replication design differences, and links to associated evidence collections that feature meta-analytic forest plots.

Promise and Perils of Pre-Analysis Plans (Economics and Finance)

Promise and Perils of Pre-analysis Plans, by Ben Olken, lays out many of the items to include in a pre-analysis plan, as well as their history, benefits, and a few potential drawbacks. Pre-analysis plans can be especially useful for reaching agreement about what will be measured, and how, when a partner or funder has a vested interest in the outcome of a study.

Reshaping Institutions (Economics and Finance)

Reshaping Institutions is a paper by Katherine Casey, Rachel Glennerster, and Edward Miguel that uses a pre-analysis plan to analyze the effects of a community-driven development program in Sierra Leone. They discuss the contents and benefits of a PAP in detail and include a “cherry-picking” table that shows the wide flexibility of analysis possible without pre-specification. The PAP itself is included in Appendix A of the supplementary materials, available at the link above.