Course materials: PhD Toolkit on Transparent, Open, and Reproducible Research (Economics and Finance)
Catalyst Ada Gonzalez-Torres developed and delivered a PhD course on Transparent, Open, and Reproducible Research for PhD students at the European University Institute (EUI) in Florence, Italy. Find all course materials here.
PhD Course Materials: Transparent, Open, and Reproducible Policy Research (Data Management and De-identification)
BITSS Catalyst Sean Grant developed and delivered a PhD course on Transparent, Open, and Reproducible Policy Research at the Pardee RAND Graduate School in Policy Analysis. Find all course materials at the project’s OSF page.
Transparency Training Module for Undergraduate Experimental Economics (Dynamic Documents and Coding Practices)
These materials were used in the final weeks of an undergraduate course in experimental economics at Wesleyan University taught by Professor Jeffrey Naecker.
These materials were developed as part of a BITSS Catalyst Training Project “Incorporating Reproducibility and Transparency in an Undergraduate Economics Course” led by Catalyst Jeffrey Naecker.
Improving the Credibility of Social Science Research: A Practical Guide for Researchers (Data Management and De-identification)
Royal Society Open Science Registered Reports (Health Sciences)
Royal Society Open Science is a fast, open journal publishing high-quality research across all of science, engineering and mathematics. A Registered Report (RR) is a form of journal article in which methods and proposed analyses are pre-registered and peer-reviewed before the research is conducted (Stage 1). High-quality protocols are then provisionally accepted for publication before data collection commences. The format is open to replication attempts as well as novel studies. Once the study is completed, the authors finish the article, including the results and discussion sections (Stage 2). This is appraised by the reviewers and, provided the necessary conditions are met, published.
Accountable Replications Policy “Pottery Barn” (Dynamic Documents and Coding Practices)
The Accountable Replication Policy commits the Psychology and Cognitive Neuroscience section of Royal Society Open Science to publishing replications of studies previously published within the journal. Authors can either submit a replication study that is already completed or a proposal to replicate a previous study. To ensure that the review process is unbiased by the results, submissions will be reviewed with existing results initially redacted (where applicable) or, in the case of study proposals, before the results exist. Submissions that report close, clear and valid replications of the original methodology will be offered in-principle acceptance, which virtually guarantees publication of the replication regardless of the study outcome.
statcheck Web App (Interdisciplinary)
statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck. Using statcheck/web, you can check any PDF for statistical errors without installing the R programming language on your computer.
Jupyter Notebooks (Data Visualization)
The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and explanatory text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, machine learning and much more.
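As a hedged illustration of the "live code plus explanation" idea, the sketch below shows the kind of data-cleaning step a notebook cell might contain; the records and field names are invented for illustration.

```python
# Illustrative notebook-cell content: clean a small dataset and summarize it.
# The data and field names here are hypothetical.
from statistics import mean

raw = [
    {"id": 1, "income": "52000"},
    {"id": 2, "income": ""},       # missing value to be dropped
    {"id": 3, "income": "61000"},
]

# Data cleaning: drop records with missing income, cast strings to numbers.
clean = [{**row, "income": float(row["income"])} for row in raw if row["income"]]

avg_income = mean(row["income"] for row in clean)
print(avg_income)  # mean over the two complete records
```

In a notebook, each such cell's output appears inline beneath it, so the cleaning decisions and their results stay visible alongside the narrative text.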
Docker (Data Visualization)
Docker is the world’s leading software container platform. Developers use Docker to eliminate “works on my machine” problems when collaborating on code with co-workers. Operators use Docker to run and manage apps side-by-side in isolated containers to get better compute density. Enterprises use Docker to build agile software delivery pipelines to ship new features faster, more securely and with confidence for both Linux and Windows Server apps.
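To make the “works on my machine” point concrete: a research project’s computational environment can be pinned in a Dockerfile, a minimal sketch of which follows. The base image, file names, and entry point are all hypothetical.

```dockerfile
# Hypothetical Dockerfile pinning an analysis environment so that every
# collaborator runs the same interpreter and package versions.
FROM python:3.11-slim
WORKDIR /analysis
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "run_analysis.py"]
```

Building and running this image reproduces the same environment on any machine with Docker installed, which is what makes containers useful for computational reproducibility.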
NeuroChambers (Issues with transparency and reproducibility)
Chris Chambers is a psychologist and neuroscientist at the School of Psychology, Cardiff University. He created this blog after taking part in a debate about science journalism at the Royal Institution in March 2012. The aim of his blog is to give you some insights from the trenches of science. He talks about a range of science-related issues and may even give up a trade secret or two.
The New Statistics (+OSF Learning Page) (Data Management and De-identification)
This OSF project helps organize resources for teaching the “New Statistics,” an approach that emphasizes asking quantitative questions, focusing on effect sizes, using confidence intervals to express uncertainty about effect sizes, using modern data visualizations, seeking replication, and using meta-analysis as a matter of course (Cumming, 2011).
statcheck
statcheck is an R package that checks for errors in statistical reporting in APA-formatted documents. It can help estimate the prevalence of reporting errors and serves as a tool to check your own work before submitting. The package can be used to automatically extract statistics from articles and recompute p values. It is also available as a web app.
re3data.org (Data Repositories)
The Registry of Research Data Repositories (re3data.org) is a global registry of research data repositories across academic disciplines. It presents repositories for the permanent storage of and access to data sets to researchers, funding bodies, publishers and scholarly institutions. re3data.org promotes a culture of sharing, increased access and better visibility of research data. The registry went live in autumn 2012 and is funded by the German Research Foundation (DFG).
Curate Science (Issues with transparency and reproducibility)
Curate Science is a crowd-sourced platform to track, organize, and interpret replications of published findings in the social sciences. Curated replication study characteristics include links to PDFs, open/public data, open/public materials, pre-registered protocols, independent variables (IVs), outcome variables (DVs), replication type, replication design differences, and links to associated evidence collections that feature meta-analytic forest plots.
Edawax
Edawax conducts meta-research on a variety of topics related to research practices, including an analysis of the data sharing policies of peer-reviewed journals, with the hope of 1) gaining insight into the obstacles to performing replications and 2) using those insights to develop resources and infrastructure that facilitate replications and meta-analysis.
Experimental Lab Standard Operating Procedures (Data Management and De-identification)
This standard operating procedure (SOP) document describes the default practices of the experimental research group led by Donald P. Green at Columbia University. These defaults apply to analytic decisions that have not been made explicit in pre-analysis plans (PAPs). They are not meant to override decisions that are laid out in PAPs. The contents of our lab’s SOP are available for public use. We welcome others to copy or adapt it to suit their research purposes.