What We Can Learn through Replication in Qualitative Research

An interview with Megan Becker (University of Southern California), by Aleksandar Bogdanoski, BITSS. Replication underscores the importance of transparency in methods and data and is critical to ensuring that science is self-correcting. Though efforts to improve replicability led by many in the open science community, including BITSS, have focused mainly…

Internal replication: another tool for the reproducibility toolkit

By Jade Benjamin-Chung (University of California, Berkeley) and Benjamin F. Arnold (University of California, San Francisco) Introduction from BITSS: Internal replication is a new tool in the reproducibility toolkit with which original study investigators replicate findings prior to submission to a peer-reviewed journal. Jade Benjamin-Chung (UC Berkeley) and Benjamin Arnold (UCSF) describe…

Kicking off a new partnership with The Choice Lab at the 68° North Conference

Jennifer Sturdy–BITSS Program Advisor “Science, my lad, is made up of mistakes, but they are mistakes which it is useful to make, because they lead little by little to the truth.” It seems fitting that I read a book which references this Jules Verne quote just before our BITSS panel on research…

Replication Project: Economics–sort of

Garret Christensen–BITSS Project Scientist I say sort of because it wasn’t run by the Center for Open Science, but in a similar spirit to the Replication Project: Psychology, Colin Camerer led a big reproducibility project related to economics experiments that came out in Science today. Here’s the related news article, and here’s…

BITSS Sessions Around the World

Garret Christensen–BITSS Project Scientist I’ve recently had the opportunity to represent BITSS at a few interesting meetings and conferences that you might be interested to hear about. A group of political scientists and other social scientists met at Stanford and held a daylong workshop to discuss steps the discipline could take to…

Replication and Transparency Workshop Jan 6-7, 2016

Garret Christensen–BITSS Project Scientist BITSS is happy to announce a workshop on replication and transparency coming up in January 2016, right after the AEA Annual Meeting in San Francisco, CA. Before I get to that, a reminder about our workshops in November and December. In November, Nicole Janz (of Cambridge University and…

Links: Blind Analysis and Pre-Analysis Plans, Replication Failure

Garret Christensen–BITSS Project Scientist There’s an interesting new proposal, called blind analysis, to deal with the problems of bias, p-hacking, and reproducibility failures, from Saul Perlmutter, the Nobel laureate UC Berkeley physicist and director of the Berkeley Institute for Data Science (where I’m a fellow). Perlmutter and Robert MacCoun of Stanford have…

Replication in Economics

Garret Christensen–BITSS Project Scientist CEGA faculty director Ted Miguel was quoted in a Wall Street Journal blog post by Anna Louie Sussman today: “At this point, everybody doing with [sic] work with data and economics has an expectation that their data is very likely to get posted online, that someone is…

The BITSS Take on "wormwars" and Replication Writ Large

Garret Christensen–BITSS Project Scientist If you’re a development economist, or at all interested in research transparency, I assume you’ve heard about the recent deworming replication controversy. (If you were lucky enough to miss “wormwars,” you can catch up with just about everything with this one set of links on storify.com). Here…

Emerging Researcher Perspectives: Replication as a Credible Pre-Analysis Plan

One of the most important tools for enhancing the credibility of research is the pre-analysis plan, or the PAP. Simply put, we feel more confident in someone’s inferences if we can verify that they weren’t data mining, engaging in motivated reasoning, or otherwise manipulating their results, knowingly or unknowingly. By publishing a…

Advisory Board Established for Project TIER

Guest post by Richard Ball and Norm Medeiros, co-principal investigators of Project TIER at Haverford College. Project TIER (Teaching Integrity in Empirical Economics) is pleased to announce its newly-established Advisory Board. The advisors – George Alter (ICPSR), J. Scott Long (Indiana University), Victoria Stodden (University of Illinois at Urbana-Champaign), and Justin Wolfers (Peterson Institute/University of Michigan) – will…

Influential Paper on Gay Marriage Might Be Marred by Fraudulent Data

Harsh scrutiny of an influential political science experiment highlights the importance of transparency in research. The paper, from UCLA graduate student Michael LaCour and Columbia University Professor Donald Green, was published in Science in December 2014. It asserted that short conversations with gay canvassers could not only change people’s minds on a divisive social issue like same-sex…

Three Transparency Working Papers You Need to Read

Garret Christensen, BITSS Project Scientist Several great working papers on transparency and replication in economics have been released in the last few months. Two of them are intended for a symposium in The Journal of Economic Perspectives, to which I am very much looking forward, and are about pre-analysis plans. The first of…

Registered Reports to the Rescue?

After writing an article for The Upshot, Brendan Nyhan (Assistant Professor at Dartmouth) was interviewed by The Washington Post. The original Upshot article advocates for a new publishing structure called Registered Reports (RRs): a research publishing format in which protocols and analysis plans are peer reviewed and registered prior to data collection, then published regardless of the outcome. In the following interview…

This Monday at AEA2015: Transparency and Integrity in Economic Research Panel

This January 5th at 10:15am at the American Economic Association Annual Meeting in Boston, MA (Sheraton Hotel, Commonwealth Room). Session: Promoting New Norms for Transparency and Integrity in Economic Research Presiding: Edward Miguel (UC Berkeley) Panelists: Brian Nosek (University of Virginia): “Scientific Utopia: Improving Openness and Reproducibility in Scientific Research” Richard Ball (Haverford College): “Replicability…

Scientists Have a Sharing Problem

On December 15th, Maggie Puniewska posted an article in The Atlantic summarizing the obstacles that prevent researchers from sharing their data. The article asks: if “science has traditionally been a field that prizes collaboration […] then why [are] so many scientists stingy with their information?” Puniewska outlines the most commonly cited reasons scientists refrain…

Reflections on Two Years Promoting Transparency in Research

By Guillaume Kroll (CEGA) Two years ago, in December 2012, a handful of researchers convened in Berkeley to discuss emerging strategies to increase openness and transparency in social science research. The group’s concerns followed a number of high-profile cases of scientific misconduct and unethical practices, particularly in psychology (1,2). As researchers started to question the…

Former BITSS Institute Participant Advocates for Replication in Brazil

Dalson Britto Figueiredo Filho, Adjunct Professor of Political Science at the Federal University of Pernambuco in Recife, Brazil, who attended the BITSS Summer Institute in June 2014, recently published a paper on the importance of replications in Revista Política Hoje. “The BITSS experience really changed my mind on how to do good science”, said Figueiredo Filho.…

Creating Standards for Reproducible Research: Overview of COS Meeting

By Garret Christensen (BITSS) Representatives from BITSS (CEGA Faculty Director Ted Miguel, CEGA Executive Director Temina Madon, and BITSS Assistant Project Scientist Garret Christensen–that’s me) spent Monday and Tuesday of this week at a very interesting workshop at the Center for Open Science aimed at creating standards for promoting reproducible research in the social-behavioral…

Scientific consensus has gotten a bad reputation—and it doesn’t deserve it

In a recent post, John Timmer, senior science editor at Ars Technica, defends the importance of consensus, opening with the following quote from author Michael Crichton: Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator…

The 10 Things Every Grad Student Should Do

In a recent post on the Data Pub blog, Carly Strasser provides a useful transparency guide for newcomers to the world of empirical research. Below is an adapted version of that post. 1. Learn to code in some language. Any language. Strasser begins her list by urging students to learn a programming language. As the limitations of…

Teaching Integrity in Empirical Research

In a recent interview on The Signal, a Library of Congress blog, Richard Ball (Economics Professor at Haverford College and presenter at the 2014 BITSS Summer Institute) and Norm Medeiros (Associate Librarian at Haverford College) discussed Project TIER (Teaching Integrity in Empirical Research) and their experience teaching students how to…

Reproducible Research: True or False?

John Ioannidis (Professor of Health Research and Policy at Stanford School of Medicine, Co-Director of the Meta-Research Innovation Center, and keynote speaker at the upcoming BITSS annual meeting) speaks at Google about efforts to improve research design standards and reproducibility in science. Ioannidis is the author of the highly influential 2005 paper Why Most Published Research Findings Are False,…

MCC's First Open Data Challenge

The U.S. Government’s Millennium Challenge Corporation (MCC) wants to hear your new and innovative ideas on how to maximize the use of data that MCC finances for its independent evaluations. Keynote speakers at this year’s BITSS Research Transparency Forum, Jennifer Sturdy and Jack Molyneaux at MCC’s Department of Policy and Evaluation, and Kathy Farley…

White House Calls for Comments on Reproducible Research

The White House’s Office of Science and Technology Policy (OSTP) has released a request for information on improving the reproducibility of federally funded scientific research. Given recent evidence of the irreproducibility of a surprising number of published scientific findings, how can the Federal Government leverage its role as a significant funder of scientific research…

Political Scientists Launch New Replication Initiative

Following a groundswell of interest in replication in the political sciences, first noticed in survey results posted on the Monkey Cage blog, political scientists Seth Werfel (Stanford University) and Nicole Janz (Cambridge University), together with research consultant Stephanie Wykstra, launched the Political Science Replication Initiative, a new repository for uploading study replications. Increasingly, methodological political scientists have recognized…

Peer Review of Social Science Research in Global Health

A new working paper by Victoria Fan, Rachel Silverman, David Roodman, and William Savedoff at the Center for Global Development. Abstract In recent years, the interdisciplinary nature of global health has blurred the lines between medicine and social science. As medical journals publish non-experimental research articles on social policies or macro-level interventions, controversies…

Replication in Economics Database

Scientific progress depends on reviewing research findings by independently replicating results, which makes those findings more reliable. In econometric research, however, publishing replication findings is not yet common practice. Replication Wiki: developed by researchers at the University of Göttingen (Germany), this wiki compiles replications of empirical studies in economics.…

The Controversy of Preregistration in Social Research

Guest post by Jamie Monogan (University of Georgia) A conversation is emerging in the social sciences over the merits of study registration and whether it should be the next step we take in raising research transparency. The notion of study registration is that, prior to observing outcome data, a researcher can publicly…

Replicate it! A Proposal to Foster Knowledge Accumulation

Thad Dunning and Susan D. Hyde in the Washington Post: Like many social scientists, we take it almost as an article of faith that scientific methods will advance our knowledge about how the world works. The growing use by social scientists of strong research designs — for example, randomized controlled experiments or…

Flawed Research On Your Plate

You might want to reconsider paying extra for those fish oil supplements. A new study finds that most of the research literature on the cardiovascular benefits of omega-3 fatty acids is flawed. In the early 1970s, two Danish researchers began investigating the diet of Greenland’s Inuit populations, who were believed to live longer than their Caucasian counterparts. The study…

The Reformation: Can Social Scientists Save Themselves?

From Jerry Adler in the Pacific Standard—on the credibility crisis in social science research, publication bias, data manipulation, and non-replicability. Featuring BITSS aficionados Brian Nosek, Joe Simmons, Uri Simonsohn and Leif Nelson. Something unprecedented has occurred in the last couple of decades in the social sciences. Overlaid on the usual academic incentives of…

New book: Implementing Reproducible Research

New book from Victoria Stodden, Friedrich Leisch, and Roger D. Peng: “Implementing Reproducible Research”. In many of today’s research fields, including biomedicine, computational tools are increasingly being used so that the results can be reproduced. Researchers are now encouraged to incorporate software, data, and code in their academic papers so that others can…

Panel on Transparency and Replication @ EGAP 11 (Berkeley, CA — Friday 4/11)

Experiments in Governance and Politics (EGAP) will be holding its eleventh biannual meeting in Berkeley, CA this Friday and Saturday (April 11-12). In addition to research design workshops and presentations of recent papers, the meeting will feature an interdisciplinary panel on transparency and replication (Friday, 3:50-6:00pm): 3:50 – 4:10 PM Thad Dunning “EGAP Regranting…

Replication Panels at ISA 2014

The 55th Annual Convention of the International Studies Association (March 26-29, 2014 – Toronto, Canada) will feature two panels on replication and reproducibility: Friday, March 28 8:15 AM – 10:00 AM Replication in International Relations: How Journal Data Policies and Replication in Teaching Can Improve Reproducibility Standards Friday, March 28 10:30 AM…

The changing face of psychology

Important changes are underway in psychology. Transparency, reliability, and adherence to scientific methods are the key words for 2014, says a recent article in The Guardian. A growing number of psychologists – particularly the younger generation – are fed up with results that don’t replicate, journals that value story-telling over truth, and an…

When is an error not an error?

Guest post by Annette N. Brown and Benjamin D. K. Wood on the World Bank Development Impact blog: We are seeing a similar propensity for replication researchers to use the word “error” (or “mistake” or “wrong”) and for this language to cause contentious discussions between the original authors and replication researchers. The…

Replication in Political Science

Here is a new initiative seeking to promote replications of quantitative work in political science. The group of researchers behind this project aims to create a site that will publish and organize replications done by graduate students in their courses — by which they mean the exercise of conducting re-analyses using the original data/code, as well…

Welcome To The Era of Big Replication

Reblogged from Ed Yong: Psychologists have been sailing through some pretty troubled waters of late. They’ve faced several cases of fraud, high-profile failures to repeat the results of classic experiments, and debates about commonly used methods that are recipes for sexy but misleading results. The critics, many of whom are psychologists themselves,…

The New Statistics: A Pathway to Research Integrity

An eight-step strategy to increase the integrity and credibility of social science research using the new statistics, by Geoff Cumming. We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research…

New Standards for Research Reporting in Psychology

Psychological Science, the flagship journal of the Association for Psychological Science (APS), is introducing innovative new guidelines for authors, part of an effort to strengthen the reporting and analysis of findings in psychological research. Starting January 1, 2014, submitting authors will be required to state that they have disclosed all important methodological details,…

Research Transparency Landscape

A landscape of funder data access policies and other resources, by Stephanie Wykstra. New technology makes sharing research outputs – not just publications but also raw data, code, software, even lab notebooks – easier than ever before. The benefits from more open science are widely acknowledged. Yet there is still room for improvement:…