Why We Need Open Policy Analysis | BITSS Scholars, Interdisciplinary

Fernando Hoces, Sean Grant, Edward Miguel

The evidence-based policy movement promotes the use of empirical evidence to inform policy decision-making. While this movement has gained traction over the last two decades, several concerns about the credibility of empirical research have been identified in scientific disciplines that use research methods and practices that are commonplace in policy analysis. As a solution, we argue that policy analysis should adopt the transparent, open, and reproducible research practices espoused in related disciplines.

Transparent and Reproducible Social Science Research (Textbook) | BITSS Scholars, Economics, Political Science, Psychology

Garret Christensen, Edward Miguel

Psychology, political science, and economics have all recently taken their turn in the spotlight with instances of influential research that fell apart when scrutinized. Beyond scandals featuring deliberate fraud, there is growing evidence that much social science research features sloppy yet inadvertent errors, and a sense that many analyses produce statistically “significant” results only by chance. Due in part to a rising number of highly publicized cases, there is growing demand for solutions. A movement is emerging across social science disciplines, and especially in economics, political science and psychology, for greater research transparency, openness, and reproducibility. Our textbook, Transparent and Reproducible Social Science Research, will be the first to crystallize the new insights, practices and methods in this growing interdisciplinary field.

Data Sharing and Citations: Causal Evidence | BITSS Scholars, Economics, Political Science

Garret Christensen, Edward Miguel, Allan Dafoe

This project attempts to estimate the causal effect of data-sharing on citations. There is a fair amount of evidence that published academic papers that make their data publicly available have, on average, a higher number of citations, but ours is the first study that attempts to address the causal nature of this relationship.

Transparency and Reproducibility in Economics Research | BITSS Scholars, Economics

Edward Miguel, Garret Christensen

There is growing interest in research transparency and reproducibility in economics and other fields. We survey existing work on these topics within economics, and discuss the evidence suggesting that publication bias, inability to replicate, and specification searching remain widespread problems in the discipline. We next discuss recent progress in this area, including through improved research design, study registration and pre-analysis plans, disclosure standards, and open sharing of data and materials, and draw on the experience in both economics and other social science fields. We conclude with a discussion of areas where consensus is emerging on the new practices, as well as approaches that remain controversial, and speculate about the most effective ways to make economics research more accurate, credible and reproducible in the future.

Conservative Tests under Satisficing Models of Publication Bias | BITSS Scholars, Economics, Social Science

Justin McCrary, Garret Christensen, Daniele Fanelli

Publication bias leads consumers of research to observe a selected sample of statistical estimates calculated by producers of research. We calculate critical values for statistical significance that undo the distortions created by this selection effect, assuming that the only source of publication bias is file drawer bias. These adjusted critical values are easy to calculate and differ from unadjusted critical values by approximately 50%—rather than rejecting a null hypothesis when the t-ratio exceeds 2, the analysis suggests rejecting a null hypothesis when the t-ratio exceeds 3. Samples of published social science research indicate that on average, across research fields, 30% of published t-statistics fall between the standard and adjusted cutoffs.

Publication available at PLOS ONE.
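The adjusted decision rule described above can be sketched in a few lines. This is an illustrative sketch, not the authors' code, and the sample t-statistics are hypothetical values chosen only to show how estimates can be classified under the two cutoffs:

```python
# Illustrative sketch (not the paper's code): compare the conventional
# rule of thumb for two-sided 5% significance (|t| > 2) with the
# bias-adjusted cutoff (|t| > 3) proposed to undo file drawer bias.

STANDARD_CUTOFF = 2.0   # conventional rule of thumb for |t|
ADJUSTED_CUTOFF = 3.0   # approximate adjusted cutoff from the paper

def classify(t_stat):
    """Classify a t-statistic under both cutoffs."""
    if abs(t_stat) > ADJUSTED_CUTOFF:
        return "significant under both cutoffs"
    if abs(t_stat) > STANDARD_CUTOFF:
        return "significant only under the standard cutoff"
    return "not significant under either cutoff"

# Hypothetical published t-statistics, for illustration only.
sample_t_stats = [1.7, 2.4, 2.9, 3.5, 4.1]
for t in sample_t_stats:
    print(f"t = {t}: {classify(t)}")

# Share of these illustrative estimates falling between the cutoffs,
# the quantity the paper reports averages about 30% across fields.
between = sum(1 for t in sample_t_stats
              if STANDARD_CUTOFF < abs(t) <= ADJUSTED_CUTOFF)
print(f"{between / len(sample_t_stats):.0%} fall between the two cutoffs")
```

Estimates in the middle band are exactly those whose significance status flips once publication bias is accounted for.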

Many analysts, one dataset: Making transparent how variations in analytical choices affect results | BITSS Scholars, Interdisciplinary

Garret Christensen

In a standard scientific analysis, one analyst or team presents a single analysis of a data set. However, there are often a variety of defensible analytic strategies that could be used on the same data. Variation in those strategies could produce very different results.

In this project, we introduce the novel approach of “crowdsourcing a dataset.” We hope to recruit multiple independent analysts to investigate the same research question on the same data set in whatever manner they see as best. This approach should be especially useful for complex data sets in which a variety of analytic approaches could be used, and when dealing with controversial issues about which researchers and others have very different priors. If everyone comes up with the same results, then scientists can speak with one voice. If not, the subjectivity and conditionality on analysis strategy are made transparent.
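A toy example can make the core idea concrete. This is not the project's data or code; it is a minimal sketch with made-up numbers showing how two equally defensible analytic strategies applied to the same data can point in opposite directions:

```python
# Toy illustration (hypothetical data): two defensible analytic
# strategies on the same dataset reach opposite conclusions, which is
# exactly the variation this project aims to make transparent.

import statistics

# Hypothetical outcome measurements for two groups; the single large
# value in group_b plays the role of an influential observation.
group_a = [3.1, 3.4, 2.9, 3.2, 3.0]
group_b = [2.8, 3.0, 2.7, 2.9, 9.5]

# Strategy 1: compare means (sensitive to the influential observation).
mean_diff = statistics.mean(group_b) - statistics.mean(group_a)

# Strategy 2: compare medians (robust to the influential observation).
median_diff = statistics.median(group_b) - statistics.median(group_a)

print(f"difference in means:   {mean_diff:+.2f}")   # group_b looks higher
print(f"difference in medians: {median_diff:+.2f}")  # group_b looks lower
```

With real data and dozens of analysts the strategy space is far richer (covariate choices, exclusion rules, model families), but the mechanism is the same: the conclusion is conditional on the analytic path taken.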

Promoting an open research culture | BITSS Scholars, Interdisciplinary

Transparency, openness, and reproducibility are readily recognized as vital features of science (1, 2). When asked, most scientists embrace these features as disciplinary norms and values (3). Therefore, one might expect that these valued features would be routine in daily practice. Yet, a growing body of evidence suggests that this is not the case (4–6).

Promoting Transparency in Social Science Research | BITSS Scholars, Interdisciplinary, Social Science

Edward Miguel, Colin Camerer, Kate Casey, J. Cohen, Kevin Esterling, Alan Gerber, Rachel Glennerster, Donald P. Green, Macartan Humphreys, Guido Imbens, Temina Madon, Leif Nelson, Brian Nosek, Maya Petersen, Richard Sedlmayr, Joseph Simmons, Mark van der Laan

There is growing appreciation for the advantages of experimentation in the social sciences. Policy-relevant claims that in the past were backed by theoretical arguments and inconclusive correlations are now being investigated using more credible methods. Changes have been particularly pronounced in development economics, where hundreds of randomized trials have been carried out over the last decade. When experimentation is difficult or impossible, researchers are using quasi-experimental designs. Governments and advocacy groups display a growing appetite for evidence-based policy-making. In 2005, Mexico established an independent government agency to rigorously evaluate social programs, and in 2012, the U.S. Office of Management and Budget advised federal agencies to present evidence from randomized program evaluations in budget requests (1, 2).