Reporting Guidelines to Reduce Publication Bias (RGPB) | BITSS Scholars

Fernando Hoces, Edward Miguel, Erik Sørensen, Bertil Tungodden

The growing demand for evidence-based policy has created an imperative to increase the rigor, reliability, and transparency of scientific research. This project aims to advance the understanding of publication bias in social science research by addressing the problem of incomplete and inconsistent reporting of results.

Our team developed an approach to standardize and record the hypotheses of studies registered in the American Economic Association (AEA) RCT Registry. This study is among the largest follow-up studies in economics aimed at measuring publication bias and selective reporting, and is novel in that it focuses on results at the research hypothesis level. 
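
Purely as an illustration of what a standardized hypothesis record could contain, here is a minimal Python sketch; the schema, class name, and all field names are assumptions for exposition, not the project's actual coding instrument.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HypothesisRecord:
    """One standardized hypothesis from a registered trial (illustrative schema)."""
    registry_id: str                 # AEA RCT Registry trial ID, e.g. "AEARCTR-0000001"
    hypothesis_text: str             # hypothesis as stated in the registration
    outcome: str                     # primary outcome the hypothesis concerns
    prespecified: bool               # listed in a pre-analysis plan?
    reported: Optional[bool] = None  # was a result for it reported in any paper?
    direction: Optional[str] = None  # "positive", "negative", or "null", if reported

# A hypothetical record:
record = HypothesisRecord(
    registry_id="AEARCTR-0000001",
    hypothesis_text="The cash transfer increases school enrollment.",
    outcome="school enrollment",
    prespecified=True,
)
```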

Our approach to standardized reporting of research results in economics – a practice already widely used in other fields, including medicine and public health – could be of interest to journals, funders, and professional associations. This project will make several contributions to understanding the nature of publication bias in economics, including more precise study-level estimates of publication bias and a better understanding of how different forms of support affect the use of reporting guidelines. The project also seeks to reduce publication bias directly by recovering the results of unpublished ("missing") studies.

Reporting All Results Efficiently (RARE) | BITSS Scholars

Edward Miguel, David Laitin

While the social sciences have made impressive progress in adopting transparent research practices that facilitate verification, replication, and reuse of materials, the problem of publication bias persists. Bias on the part of peer reviewers and journal editors, as well as the use of outdated research practices by authors, continues to skew the literature toward statistically significant effects, many of which may be false positives. To mitigate this bias, we propose a framework to enable authors to report all results efficiently (RARE), with an initial focus on experimental and other prospective empirical social science research that utilizes public study registries. This framework depicts an integrated system that leverages the capacities of existing infrastructure in the form of public registries, institutional review boards, journals, and granting agencies, as well as investigators themselves, to efficiently incentivize full reporting and thereby improve confidence in social science findings. In addition to increasing access to the results of scientific endeavors, a well-coordinated research ecosystem can prevent scholars from wasting time investigating questions in ways that have already proven unfruitful, and can reduce wasted funds on the part of granting agencies.

Publications associated with this project:

  • Laitin, David D., Edward Miguel, Ala’ Alrababa’h, Aleksandar Bogdanoski, Sean Grant, Katherine Hoeberling, Cecilia Hyunjung Mo, et al. “Reporting All Results Efficiently: A RARE Proposal to Open up the File Drawer.” Proceedings of the National Academy of Sciences 118, no. 52 (December 28, 2021). https://doi.org/10.1073/pnas.2106178118.

Evidence on Research Transparency in Economics | BITSS Scholars, Economics

Edward Miguel

A decade ago, the term “research transparency” was not on economists’ radar screen, but in a few short years a scholarly movement has emerged to bring new open science practices, tools, and norms into the mainstream of our discipline. The goal of this article is to lay out the evidence on the adoption of these approaches—in three specific areas: open data, pre-registration and pre-analysis plans, and journal policies—and, more tentatively, begin to assess their impacts on the quality and credibility of economics research. The evidence to date indicates that economics (and related quantitative social science fields) are in a period of rapid transition toward new transparency-enhancing norms. While solid data on the benefits of these practices in economics are still limited, in part due to their relatively recent adoption, there is growing reason to believe that critics’ worst fears regarding onerous adoption costs have not been realized. Finally, the article presents a set of frontier questions and potential innovations.

The State of Social Science (3S) Survey | BITSS Scholars, Economics, Interdisciplinary, Political Science, Psychology, Sociology

Edward Miguel, Garret Christensen, Elizabeth Levy Paluck

Has there been a meaningful movement toward open science practices within the social sciences in recent years? Discussions about changes in practices such as posting data and pre-registering analyses have been marked by controversy—including over the extent to which change has taken place. This study, based on the State of Social Science (3S) Survey, provides the first comprehensive assessment of awareness of, attitudes towards, perceived norms regarding, and adoption of open science practices within a broadly representative sample of scholars from four major social science disciplines: economics, political science, psychology, and sociology. We observe a steep increase in adoption: as of 2017, over 80% of scholars had used at least one such practice, rising from one quarter a decade earlier. Attitudes toward research transparency are on average similar between older and younger scholars, but the pace of change differs by field and methodology. Consistent with theories of normal science and scientific change, the timing of the increase in adoption coincides with technological innovations and institutional policies. Patterns are consistent with most scholars underestimating the trend toward open science in their discipline.

Publications associated with this project:

  • Christensen, Garret, Zenan Wang, Elizabeth Levy Paluck, Nicholas Swanson, David J. Birke, Edward Miguel, and Rebecca Littman. “Open Science Practices Are on the Rise: The State of Social Science (3S) Survey.” Preprint. MetaArXiv, October 18, 2019. https://doi.org/10.31222/osf.io/5rksu.
  • Ferguson, J., R. Littman, G. Christensen, et al. “Survey of Open Science Practices and Attitudes in the Social Sciences.” Nature Communications 14 (2023): 5401. https://doi.org/10.1038/s41467-023-41111-1.

A Framework for Open Policy Analysis | BITSS Scholars, Interdisciplinary, Public Policy

Fernando Hoces, Sean Grant, Edward Miguel

The evidence-based policy movement promotes the use of empirical evidence to inform policy decision-making. While this movement has gained traction over the last two decades, serious concerns about the credibility of empirical research have been identified in scientific disciplines whose methods and practices are commonplace in policy analysis. As a solution, Hoces de la Guardia and colleagues argue that policy analysis should adopt the transparent, open, and reproducible research practices espoused in related disciplines.

Publications associated with this project:

  • Hoces de la Guardia, Fernando, Sean Grant, and Edward Miguel. “A Framework for Open Policy Analysis.” Science and Public Policy, no. scaa067 (December 3, 2020). https://doi.org/10.1093/scipol/scaa067. An open-access version is available here.

The Impact of Data Sharing on Article Citations | BITSS Scholars, Economics, Political Science

Garret Christensen, Allan Dafoe, Edward Miguel, Don Moore, Andrew K. Rose

This study estimates the effect of data sharing on the citations of academic articles, using journal policies as a natural experiment. We begin by examining 17 high-impact journals that have adopted the requirement that data from published articles be publicly posted. We match these 17 journals to 13 journals without policy changes and find that empirical articles published just before the change in editorial policy have citation rates that are not statistically distinguishable from those published shortly after the shift. We then ask whether this null result stems from poor compliance with data-sharing policies, and use the policy changes as instrumental variables to examine more closely two leading journals in economics and political science with relatively strong enforcement of their new data policies. We find that articles that make their data available receive 97 additional citations (estimated standard error: 34). We conclude that (a) authors who share data may eventually be rewarded with additional scholarly citations, and (b) data-posting policies alone do not increase the impact of articles published in a journal unless those policies are enforced.
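
As a hedged illustration of the instrumental-variables strategy described above (not the authors' code), the sketch below instruments an article's data availability with whether it appeared after its journal's policy change; the synthetic data, variable names, and use of the linearmodels package are all assumptions.

```python
import numpy as np
import pandas as pd
from linearmodels.iv import IV2SLS

# Hypothetical article-level data: citation counts, whether the article's
# data are posted (endogenous), and whether the article appeared after the
# journal's data-sharing policy took effect (the instrument).
rng = np.random.default_rng(0)
n = 500
post_policy = rng.integers(0, 2, n)                        # instrument
data_posted = (rng.random(n) < 0.2 + 0.5 * post_policy)    # first stage
citations = 20 + 97 * data_posted + rng.normal(0, 50, n)   # illustrative effect

df = pd.DataFrame({
    "citations": citations,
    "data_posted": data_posted.astype(float),
    "post_policy": post_policy.astype(float),
    "const": 1.0,
})

# 2SLS: the policy change shifts data posting but, by assumption, affects
# citations only through it (the exclusion restriction).
res = IV2SLS(df["citations"], df[["const"]], df["data_posted"], df["post_policy"]).fit()
print(res.params["data_posted"], res.std_errors["data_posted"])
```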

Find the most recent version of this paper here.

Transparency and Reproducibility in Economics Research | BITSS Scholars, Economics

Edward Miguel, Garret Christensen

There is growing interest in research transparency and reproducibility in economics and other fields. We survey existing work on these topics within economics and discuss the evidence suggesting that publication bias, inability to replicate, and specification searching remain widespread problems in the discipline. We next discuss recent progress in this area, including improved research design, study registration and pre-analysis plans, disclosure standards, and open sharing of data and materials, drawing on experience in both economics and other social science fields. We conclude with a discussion of areas where consensus is emerging on the new practices, as well as approaches that remain controversial, and we speculate about the most effective ways to make economics research more accurate, credible, and reproducible in the future.

Find the most recent version of this paper here.

Conservative Tests under Satisficing Models of Publication Bias | BITSS Scholars, Metascience (Methods and Archival Science), Social Science

Justin McCrary, Garret Christensen, Daniele Fanelli

Publication bias leads consumers of research to observe a selected sample of statistical estimates calculated by producers of research. We calculate critical values for statistical significance that undo the distortions created by this selection effect, assuming that the only source of publication bias is file drawer bias. These adjusted critical values are easy to calculate and differ from unadjusted critical values by approximately 50%—rather than rejecting a null hypothesis when the t-ratio exceeds 2, the analysis suggests rejecting a null hypothesis when the t-ratio exceeds 3. Samples of published social science research indicate that on average, across research fields, 30% of published t-statistics fall between the standard and adjusted cutoffs.
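
A minimal sketch of where the adjustment comes from under the pure file-drawer model, assuming published t-statistics are approximately standard normal under the null: if only results with |t| above the conventional cutoff appear in journals, the cutoff applied to published results must be raised until the null rejection rate among published results equals the nominal level.

```python
from scipy.stats import norm

alpha = 0.05                  # nominal two-sided test size
c0 = norm.ppf(1 - alpha / 2)  # standard critical value, about 1.96

# Pure file-drawer bias: only results with |t| > c0 are published.
# To keep size alpha among published results, choose c so that
#   P(|T| > c) / P(|T| > c0) = alpha under the null,
# i.e. P(|T| > c) = alpha**2, split evenly across the two tails.
c = norm.ppf(1 - alpha**2 / 2)

print(f"standard cutoff: {c0:.2f}")  # 1.96
print(f"adjusted cutoff: {c:.2f}")   # about 3.02, i.e. roughly 50% larger
```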

Find the most recent version of this paper here.

Many Analysts, One Dataset: Making Transparent How Variations in Analytical Choices Affect Results | BITSS Scholars, Interdisciplinary

Garret Christensen

In a standard scientific analysis, one analyst or team presents a single analysis of a data set. However, there are often a variety of defensible analytic strategies that could be used on the same data. Variation in those strategies could produce very different results.

In this project, we introduce the novel approach of “crowdsourcing a dataset.” We hope to recruit multiple independent analysts to investigate the same research question on the same data set in whatever manner they see as best. This approach should be especially useful for complex data sets in which a variety of analytic approaches could be used, and when dealing with controversial issues about which researchers and others have very different priors. If everyone comes up with the same results, then scientists can speak with one voice. If not, the subjectivity and conditionality on analysis strategy are made transparent.

This first project establishes a protocol for the independent, simultaneous analysis of a single dataset by multiple teams, and for resolving the variation in analytic strategies and effect estimates among them. The research question for this first attempt at crowdsourcing is as follows: Are soccer referees more likely to give red cards to dark-skin-toned players than to light-skin-toned players?
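
As a self-contained sketch of the specification-variation idea (not any team's actual submission), the code below fits several defensible models of the same question to synthetic data and compares the estimates; the synthetic data and all variable names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic player-level data standing in for the shared dataset.
rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "skin_tone": rng.uniform(0, 1, n),  # 0 = light, 1 = dark
    "position": rng.choice(["def", "mid", "fwd"], n),
    "league": rng.choice(["A", "B", "C"], n),
})
logit_p = -3 + 0.4 * df["skin_tone"]
df["red_card"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Several defensible analytic strategies for the same question.
specs = {
    "bivariate": "red_card ~ skin_tone",
    "plus_position": "red_card ~ skin_tone + C(position)",
    "plus_league": "red_card ~ skin_tone + C(position) + C(league)",
}

for name, formula in specs.items():
    fit = smf.logit(formula, data=df).fit(disp=0)
    print(f"{name:14s} log-odds = {fit.params['skin_tone']:.3f} "
          f"(SE {fit.bse['skin_tone']:.3f})")
```

If the estimate is stable across specifications, the analysts can speak with one voice; if it is not, the conclusion's dependence on analytic choices is made transparent.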

Find the most recent version of Christensen’s analysis here. Learn more about the project here.

Promoting an Open Research Culture | BITSS Scholars, Interdisciplinary

Brian Nosek

The Transparency and Openness Promotion (TOP) Committee met at the Center for Open Science in Charlottesville, Virginia, in November 2014 to address one important element of the incentive system: journals’ procedures and policies for publication. The committee consisted of disciplinary leaders, journal editors, funding agency representatives, and disciplinary experts, largely from the social and behavioral sciences. By developing shared standards for open practices across journals, we hope to translate scientific norms and values into concrete actions and to change current incentive structures so as to drive researchers’ behavior toward more openness. Although some issues are idiosyncratic to particular disciplines, we sought to produce guidelines that focus on the commonalities across disciplines.

Find the most recent version of this paper here.

Contributors to this project include: B. A. Nosek, G. Alter, G. C. Banks, D. Borsboom, S. D. Bowman, S. J. Breckler, S. Buck, C. D. Chambers, G. Chin, G. Christensen, M. Contestabile, A. Dafoe, E. Eich, J. Freese, R. Glennerster, D. Goroff, D. P. Green, B. Hesse, M. Humphreys, J. Ishiyama, D. Karlan, A. Kraut, A. Lupia, P. Mabry, T. Madon, N. Malhotra, E. Mayo-Wilson, M. McNutt, E. Miguel, E. Levy Paluck, U. Simonsohn, C. Soderberg, B. A. Spellman, J. Turitto, G. VandenBos, S. Vazire, E. J. Wagenmakers, R. Wilson, and T. Yarkoni.

Promoting Transparency in Social Science Research | BITSS Scholars, Interdisciplinary, Social Science

Edward Miguel, Colin Camerer, Kate Casey, J. Cohen, Kevin Esterling, Alan Gerber, Rachel Glennerster, Donald P. Green, Macartan Humphreys, Guido Imbens, Temina Madon, Leif Nelson, Brian Nosek, Maya Petersen, Richard Sedlmayr, Joseph Simmons, Mark van der Laan

There is a growing appreciation for the advantages of experimentation in the social sciences. Policy-relevant claims that in the past were backed by theoretical arguments and inconclusive correlations are now being investigated using more credible methods. Changes have been particularly pronounced in development economics, where hundreds of randomized trials have been carried out over the last decade. When experimentation is difficult or impossible, researchers are using quasi-experimental designs. Governments and advocacy groups display a growing appetite for evidence-based policy-making. In 2005, Mexico established an independent government agency to rigorously evaluate social programs, and in 2012, the U.S. Office of Management and Budget advised federal agencies to present evidence from randomized program evaluations in budget requests.

Accompanying these changes, however, is a growing sense that the incentives, norms, and institutions under which social science operates undermine the gains from improved research design. Commentators point to a dysfunctional reward structure in which statistically significant, novel, and theoretically tidy results are published more easily than null, replication, or perplexing results. Social science journals do not mandate adherence to reporting standards or study registration, and few require data sharing. In this context, researchers have incentives to analyze and present data so as to make them more “publishable,” even at the expense of accuracy. Researchers may select a subset of positive results from a larger study that overall shows mixed or null results, or present exploratory results as if they were tests of prespecified analysis plans. These practices, coupled with limited accountability for researcher error, have the cumulative effect of producing a distorted body of evidence with too few null effects and too many false positives, exaggerating the effectiveness of programs and policies. Even if errors are eventually brought to light, the stakes remain high because policy decisions based on flawed research affect millions of people.

In this article, we survey recent progress toward research transparency in the social sciences and make the case for standards and practices that help realign scholarly incentives with scholarly values. We argue that emergent practices in medical trials provide a useful, but incomplete, model for the social sciences. New initiatives in social science seek to create norms that, in some cases, go beyond what is required of medical trials.

Find the most recent version of this paper here. Find an open access version here.