We share insights from a recent article published in the Proceedings of the National Academy of Sciences (PNAS), co-authored by CEGA/BITSS Faculty Co-Director Ted Miguel and several members of the BITSS community. The article discusses how researchers, funders, journal editors, and others can use the Reporting All Results Efficiently (RARE) framework to reduce publication bias and strengthen the credibility of social science research.
The burgeoning metascientific literature has taught us a lot about publication bias: perhaps most alarmingly, that statistically significant results (those that confidently attribute an outcome to a specific intervention rather than to chance) are 40% more likely to be published and 60% more likely to be written up than null results (which show no evidence that the intervention was effective). This bias has wasted research resources (50 to 80% of projects funded by the Food and Drug Administration and National Institutes of Health never result in publication) and has denied the scientific and policy communities access to useful, and potentially very important, evidence. For example, a landmark study found that community-driven development projects in Sierra Leone had no effect on developing local democratic institutions; had it not been published, policymakers would likely have invested billions more in such programs.
Why are null results so rarely reported in social science research? And what can the scientific community do to reduce this type of publication bias?
In 2019, BITSS hosted a workshop called “Unlocking the File Drawer,” designed to generate consensus around these two important and deceptively simple questions. The workshop brought together researchers, journal editors, institutional review board (IRB) members, and partners across the research lifecycle to discuss underlying issues and brainstorm solutions. Insights from the workshop informed the development of a new framework for Reporting All Results Efficiently (RARE), recently published in the Proceedings of the National Academy of Sciences. RARE lays out a plan for reducing publication bias through systematic reporting of social science results.
Several tools introduced in the social sciences over the last decade—including hypothesis registries and preprints (sometimes referred to as “working papers”)—have made it easier for researchers to transparently share the plans for and results of their studies. However, while (pre-)registration is slowly becoming the norm in most social sciences (partly driven by journal requirements), the results of many research projects remain stuck in the “file drawer,” never to be shared. And while over 300 journals now invite articles in the registered reports format (also referred to as “pre-results review”), in which articles are accepted before the results are known, this is unfortunately still far from enough to make a difference at scale. Based on our rough estimate, less than 11% of projects registered on the American Economic Association’s Registry for Randomized Controlled Trials (AEA RCT Registry) have reported any results after their stated end date. We suspect the reason is a combination of missing norms around good reporting practices, a lack of appropriate reporting infrastructure, and the absence of suitable incentives and enforcement mechanisms.
The RARE framework proposes a systematic approach to this critical issue.
RARE includes clear recommendations for five prominent actors in the scientific ecosystem: hypothesis registries, researchers, IRBs, research funders, and journals. The recommendations can be summarized as follows:
- Registries need to develop infrastructure that allows researchers to report results for their registered hypotheses and provide interpretation. Such infrastructure would ideally be paired with guidance on best practices for reporting results (e.g., in the form of reporting templates).
- In their IRB and funding applications, researchers need to describe their plans to share results with the research community and their research participants.
- Using their mandate to ensure that research participants benefit from participating (the ethical principle of beneficence), IRBs can serve as the first checkpoint for researchers’ plans.
- Research funders, who wield a great deal of power in promoting research norms, can ask that grantees share past research outcomes as part of their applications, while also providing additional support for reporting results (e.g., funding for research assistant time and data storage, which, per the forthcoming data policy, can be included in proposals to the National Institutes of Health).
- Finally, journals can make full reporting a condition of acceptance and incentivize good reporting practices by making an explicit commitment to publish good-quality research regardless of the nature of its results. Several innovations, such as registered reports (also known as “pre-results review” in economics) and null-results editorial statements, have already contributed to more null results being published.
We believe that, if adopted, these reforms can dramatically increase the efficiency and completeness of reporting across the scientific ecosystem. Researchers will be less likely to duplicate research efforts that were never published (perhaps because they yielded null results), saving time, energy, and financial resources. They will also be able to report all results free of concerns that a null result may ruin the chance of publication. This new equilibrium will benefit evidence-based decision-making by giving analysts and policymakers access to a more complete body of evidence.
The BITSS team at CEGA is eager to develop consensus around the RARE framework. If you have thoughts, questions, or suggestions, we welcome them at email@example.com. In the coming months and years, we hope to work with our partners to invest in tools for full reporting by developing and testing guidance and working with registries to develop the necessary infrastructure. As we write in our article, our goal is to advance knowledge and make it easy to do so by working toward “a world in which less research funding is wasted, policy decisions are based on the most complete and unbiased evidence possible, and where scientific expertise is more widely valued and trusted by the public.”