By Guillaume Kroll (CEGA)
Two years ago, in December 2012, a handful of researchers convened in Berkeley to discuss emerging strategies to increase openness and transparency in social science research. The group’s concerns followed a number of high-profile cases of scientific misconduct and unethical practices, particularly in psychology (1,2). As researchers began to question the legitimacy of the “publish or perish” structure governing academia, many set out to replicate influential findings in their fields to separate the rigorous from the untrustworthy… only to find that a large majority of studies could not be reproduced.
This observation triggered an unprecedented number of bottom-up innovations to restore the credibility of scientific evidence across social science disciplines, including the use of study registries, pre-analysis plans, data sharing, and results-blind peer review. The 2012 meeting resulted in the creation of BITSS, the Berkeley Initiative for Transparency in the Social Sciences, with the goal of fostering the adoption of more transparent research practices among the scientific community.
Today, BITSS has more than 150 affiliated researchers and partner institutions committed to improving the standards of rigor and integrity across the social sciences. Last week’s third annual BITSS meeting was a good opportunity to reflect on the progress achieved.
What has changed over the past two years?
- Transparency practices are slowly becoming mainstream. The AEA registry for randomized trials in economics now has 293 studies registered, and the EGAP registry in political science has 103. The Open Science Framework has 7,150 users (an average of 14.2 new users every day), and other data-sharing platforms like Dataverse are growing too. Massive replication projects are gaining traction. Journals are realizing the central role they have to play, and are changing their publication requirements accordingly (1,2,3,4,5).
- It has never been so easy to be transparent. There is a growing number of tools and resources tailored to all sorts of audiences, including a profusion of data repositories, reporting standards (1,2), ready-to-paste disclosure statements (such as the 21-word solution: “We report how we determined our sample size, all data exclusions (if any), all manipulations, and all measures in the study”), and free open-source tools to support collaborative research workflows. And in a couple of months, BITSS will publish a manual of best practices for transparent research.
- Replication efforts are here to stay, and this is a good thing. Faced with concerns over the widespread irreproducibility of published findings, psychology is slowly cleaning up its act. Brian Nosek’s Center for Open Science is spearheading many of these efforts, and there is no reason economics, political science, and the other social science disciplines shouldn’t follow the same path.
- There are still critics of transparency, but at least people are talking about it. Not everyone is convinced of the need to pre-register, write a pre-analysis plan, or share data, and that is healthy. As new practices emerge, they need to be debated and refined until they reach a certain level of consensus.
What did we learn at this year’s meeting?
- Meta-research is getting more attention. From Neil Malhotra’s new insights on the file drawer problem, to Victoria Stodden’s computational approaches to reproducibility, John Ioannidis’ pioneering work in biomedicine, and Uri Simonsohn’s upcoming “false-positive economics”, an increasing number of studies are shedding light on the scope of transparency problems. These contributions are particularly important because they allow us to direct efforts to the pain points where they are most needed.
- The lack of transparency is a major threat to scientific progress. This should be clear by now. “You are 60% likely to find a statistically significant result for a false positive after a couple of simple data manipulations” (Simonsohn; a minimal simulation sketch after this list illustrates the mechanism). “Only 20% of social science experiments with null results ever get published, as compared to 50% for studies with strong or moderate results […] Worse, 65% of null-result studies never even get written up” (Malhotra). “Only 27% of the computations in scientific articles published in Science are reproducible” (Stodden). These are just a few findings from recent meta-research studies. The problem is that dubious or opaque research practices, coupled with publication bias, create a distorted and untrustworthy body of scientific “evidence”.
- But there is a real movement to enact change. From BITSS to the Center for Open Science, DA-RT, EGAP, Michigan’s ICPSR, Berkeley’s D-Lab, Haverford’s Project TIER, Stanford’s METRICS, Harvard’s IQSS, the California Digital Library, ReplicationWiki, and funders like the Alfred P. Sloan and Laura and John Arnold Foundations, there is now an active ecosystem of projects and institutions behind the transparency agenda, and they are all gaining momentum.
- It’s not only about academia. There are many organizations carrying out research outside of academia, such as think tanks and government agencies, whose researchers are subject to a different set of perverse incentives. Jack Molyneaux showed us how his organization (the Millennium Challenge Corporation) managed to make all of its impact evaluation data public while protecting the privacy of research subjects.
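To make Simonsohn’s 60% figure above concrete, here is a minimal simulation sketch of p-hacking under the null. It is not his actual analysis: the sample size, the correlation between outcomes, and the particular “researcher degrees of freedom” below are illustrative assumptions. Both groups are drawn from the same distribution, so any significant difference is a false positive; yet by testing two correlated outcomes, their average, and an early peek at half the sample, then reporting whichever test came out significant, the false-positive rate climbs well above the nominal 5%. Stacking more such degrees of freedom is what pushes it toward 60%.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, n = 2_000, 20  # illustrative values, not taken from any study

def pval(a, b):
    """Two-sided two-sample t-test p-value."""
    return stats.ttest_ind(a, b).pvalue

false_positives = 0
for _ in range(n_sims):
    # Both groups come from the SAME distribution: the null is true,
    # so every "significant" result below is a false positive.
    # Each subject has two correlated outcome measures.
    cov = [[1.0, 0.5], [0.5, 1.0]]
    g1 = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    g2 = rng.multivariate_normal([0.0, 0.0], cov, size=n)

    pvals = [
        pval(g1[:, 0], g2[:, 0]),                # test outcome 1
        pval(g1[:, 1], g2[:, 1]),                # ...or outcome 2
        pval(g1.mean(1), g2.mean(1)),            # ...or their average
        pval(g1[: n // 2, 0], g2[: n // 2, 0]),  # ...or a peek at half the sample
    ]
    # Report whichever analysis "worked" -- the essence of p-hacking.
    if min(pvals) < 0.05:
        false_positives += 1

print(f"False-positive rate: {false_positives / n_sims:.1%}")
# Well above the nominal 5%; adding more degrees of freedom
# (covariates, exclusions, more peeking) drives it higher still.
```

The key mechanism is selective reporting: each individual test keeps its nominal 5% error rate, but taking the best p-value across several analyses that were never pre-specified does not, which is exactly what pre-registration and pre-analysis plans are designed to prevent.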
What’s the best way forward? (That’s the hard one.)
- Convincing the masses vs. focusing on the elite? One of the outstanding questions is how to get people to adopt best practices. The traditional view is to make all researchers understand the intrinsic value of transparency, convincing them that a few upfront investments will go a long way toward making them better and more successful scholars. But maybe that’s too hard, or not even necessary? That’s what Skip Lupia and Colin Elman believe. Their DA-RT initiative targets change agents, in this case political science journal editors, to have them adopt new requirements for publication in their outlets. Once you get the leading researchers on board (who are also often journal editors), you can gain massive traction, according to DA-RT.
- Training the next generation of researchers. It’s no surprise that a successful, tenured researcher at a top-tier university has less incentive to make a long-established workflow more open and transparent than a graduate student contemplating an academic career. BITSS has taken the lead in equipping the researchers of tomorrow with the tools they need to stand out in the academic crowd (which we believe is called transparency).
- The incentive structure has to change. Yet for the above to work, researchers must be rewarded for being transparent, which is not yet a reality in the current system. Skeptics argue that, at least in the short term, researchers still have more incentive to p-hack and distort their findings in order to get published than to follow transparent practices. This needs to change. Two landmark efforts are the COS system of badges and the innovator prizes that BITSS is currently considering.
- Building a better understanding of how to integrate transparency in teaching. As research transparency topics slowly make their way into the mainstream teaching of economics, political science, and psychology, an increasing number of scholars are gathering valuable insights about how to teach these new practices. We need these people to share teaching materials, experiences, and best practices.
- Building a consensus about how to do replications. The value of replications is becoming clear to everyone, but how to ensure they are conducted rigorously, with the objective of advancing science rather than unnecessarily (and sometimes erroneously) blaming original authors whose work fails to replicate, is still far from obvious. Recent controversies in psychology call for caution and for further efforts to develop standards in this area.
In 2015, BITSS will keep playing a leading role in answering these outstanding questions and advocating for more open and reproducible science. Our upcoming manual of best practices will be an important milestone in this direction. In the meantime, you can consult all the material presented at our annual meeting on the OSF, and tune in to our YouTube channel for forthcoming videos of the conference. Finally, if you attended the annual meeting or followed the livestream (or just have ideas or comments), we are always eager to receive feedback. May 2015 be a year of transparent, rigorous, and responsible research.