Welcome to the BITSS Blog

The BITSS Blog is an interdisciplinary venue for discussing issues in meta-science, including transparency, reproducibility, methodology, ethics, and access. Posts are authored by BITSS staff, affiliated partners, and others broadly interested in open science and meta-science.

If you are interested in submitting to the BITSS Blog, please see Guidance for Authors.


  • New Standards for Research Reporting in Psychology

    Psychological Science, the flagship journal of the Association for Psychological Science (APS), is introducing innovative new guidelines for authors, part of an effort to strengthen the reporting and analysis of findings in psychological research. Starting January 1, 2014, submitting authors will be required to state that they have disclosed all important methodological details,…


  • Research Transparency Landscape

A landscape of funder data access policies and other resources, by Stephanie Wykstra. New technology makes sharing research outputs – not just publications but also raw data, code, software, even lab notebooks – easier than ever before. The benefits of more open science are widely acknowledged. Yet there is still room for improvement:…


  • Setting up a replication workshop: Typical Challenges


  • The Folly of Powering Replications Based on Observed Effect Size

Uri Simonsohn on replications: It is common for researchers running replications to set their sample size assuming the effect size the original researchers obtained is correct. So if the original study found an effect size of d = .73, the replicator assumes the true effect is d = .73, and sets the sample size so as to have…


  • Too Much Trusting, Not Enough Verifying

    This week in The Economist: Too many of the findings that fill the academic ether are the result of shoddy experiments or poor analysis […] One reason is the competitiveness of science […] The obligation to “publish or perish” has come to rule over academic life. Competition for jobs is cut-throat […] Nowadays verification (the…


  • Trying out the new Trial Registries

    Reblogged from World Bank’s David McKenzie: Both the American Economic Association and 3ie have launched Impact Evaluation Trial Registries […] I recently tried out both registries by registering a couple of studies I have underway, so thought I’d share some feedback on the process for those of you wondering whether/how to register. Read…


  • Changes in the Research Process Must Come From the Scientific Community

In a recent article written for a major policy journal, Victoria Stodden urges the scientific community to take the lead in establishing a new framework for more transparent research practices. While recent policy changes by the US government regarding public access to data and publications from federally funded research can…


  • Let’s Go Fishing

    An interesting piece on p-fishing and what we can do about it.


  • The Imperative to Share Complete Replication Files

    “Good research involves publishing complete replication files, making every step of research as explicit and reproducible as is practical.” This is the conclusion from a new paper by political scientist Allan Dafoe (Yale University). Dafoe examines the availability of replication data in political science journals, and concludes that “for the majority of published statistical analyses, […]…


  • New Registry for Impact Evaluations in International Development

The 3ie Registry for International Development Impact Evaluations (RIDIE) is a registry of impact evaluations related to development in low- and middle-income countries. The purpose of the registry is to enhance the transparency and quality of evaluation research, as well as to provide a repository of impact evaluation studies for researchers, funders, and…


  • Bias Minimization Lessons from Medicine – How We Are Leaving a $100 Bill on the Ground

By Alex Eble (Brown University), Peter Boone (Effective Intervention), and Diana Elbourne (University of London). The randomized controlled trial (RCT) now has pride of place in much applied work in economics and other social sciences. Economists increasingly use the RCT as a primary method of investigation, and aid agencies such as the World…


  • AEA RCT Registry Webinar This Friday

    The American Economic Association’s RCT Registry is a registration tool for pre-analysis plans of Randomized Controlled Trials in economics and other social sciences. The Abdul Latif Jameel Poverty Action Lab (J-PAL) will be hosting a brown bag webcast this Friday, September 20th at 1pm (EDT) to go over the motivations behind the registry and…


  • The Role of Failure in Promoting Transparency

By Carson Christiano (CEGA). You may wonder why a network of development researchers is taking the lead on a transparency initiative. The answer lies in the profound and omnipresent power of failure. Most would agree that risk-taking is essential to innovation, whether we’re talking about creating a simple hand-washing station or a…


  • Research Transparency in the Natural Sciences: What can we learn?

By Temina Madon (CEGA, UC Berkeley). As we all know, experimentation in the natural sciences far predates the use of randomized controlled trials (RCTs) in medicine and the social sciences; some of the earliest controlled experiments were conducted in the 1920s by R.A. Fisher, an agricultural scientist evaluating new crop varieties across…


  • Transparency-Inducing Institutions and Legitimacy

By Kevin M. Esterling (Political Science, UC Riverside). Whenever I discuss the idea of hypothesis preregistration with colleagues in political science and in psychology, the reactions I get typically range from resistance to outright hostility. These colleagues obviously understand the limitations of research founded on false positives and data over-fitting. They are even…


  • The Need for Pre-Analysis: First Things First

By Richard Sedlmayr (Philanthropic Advisor). When we picture a desperate student running endless tests on his dataset until some feeble point finally meets statistical reporting conventions, we are quick to dismiss the results. But the underlying issue is ubiquitous: it is hard to analyze data without getting caught in a hypothesis drift,…


  • Freedom! Pre-Analysis Plans and Complex Analysis

By Gabriel Lenz (UC Berkeley). Like many researchers, I worry constantly about whether findings are true or merely the result of a process variously called data mining, fishing, capitalizing on chance, or p-hacking. Since academics face extraordinary incentives to produce novel results, many suspect that “torturing the data until it speaks” is…


  • Transparency and Pre-Analysis Plans: Lessons from Public Health

By David Laitin (Political Science, Stanford). My claim in this blog entry is that political science will remain principally an observation-based discipline, and that our core principles of establishing findings as significant should consequently be based upon best practices in observational research. This is not to deny that there is an expanding…


  • Targeted Learning from Data: Valid Statistical Inference Using Data Adaptive Methods

By Maya Petersen, Alan Hubbard, and Mark van der Laan (Public Health, UC Berkeley). Statistics provide a powerful tool for learning about the world, in part because they allow us to quantify uncertainty and control how often we falsely reject null hypotheses. Pre-specified study designs, including analysis plans, ensure that we understand…


  • Monkey Business

By Macartan Humphreys (Political Science, Columbia & EGAP). I am sold on the idea of research registration. Two things convinced me. First, I have been teaching courses in which each week we try to replicate prominent results produced by political scientists and economists working on the political economy of development. I advise…


  • Bayes’ Rule and the Paradox of Pre-Registration of RCTs

By Donald P. Green (Political Science, Columbia). Not long ago, I attended a talk at which the presenter described the results of a large, well-crafted experiment. His results indicated that the average treatment effect was close to zero, with a small standard error. Later in the talk, however, the speaker revealed that…


  • An Open Discussion on Promoting Transparency in Social Science Research

By Edward Miguel (Economics, UC Berkeley). This CEGA Blog Forum builds on a seminal research meeting held at the University of California, Berkeley on December 7, 2012. The goal was to bring together a select interdisciplinary group of scholars – from biostatistics, economics, political science, and psychology – with a shared interest…