A pre-analysis plan example: A Sierra Leone study on GoBifo local institutional reforms

A few years after the end of a violent civil war in Sierra Leone, three researchers from the Abdul Latif Jameel Poverty Action Lab (J-PAL) and UC Berkeley, myself included (Prof. Miguel, here!), carried out a Randomized Controlled Trial (RCT) to evaluate the impacts of an intervention that sought to strengthen public participation in governance institutions. The intervention, called GoBifo, involved cash transfers, training sessions, and outreach to historically marginalized groups. Our pre-registered study demonstrated some positive impacts of the project, but not necessarily the intended benefits.

What makes our use of a pre-analysis plan more interesting is that a simultaneous, qualitative study carried out by the World Bank found much more positive impacts. The World Bank study, however, was not registered, raising the question of whether its authors selectively reported results. Which do you think is more valid?

This article is a good example of a case in which a pre-analysis plan was useful. Katherine Casey, Rachel Glennerster, and I exploited the random assignment of communities to a governance program to evaluate its impact. Our use of a pre-analysis plan turned out to be very valuable in improving the validity of our findings.

We set out to assess whether a “community-driven development” project, implemented in post-conflict Sierra Leone, improved local economic outcomes and institutional capacity in the form of democratized action, decision-making, and inclusivity. We found short-run positive effects on local economic outcomes, but no evidence of strengthened institutions.

In the paper, we discuss the general benefits of using a pre-analysis plan, or PAP:

“While the experimental framework naturally imposes some narrowing of econometric specifications, there is still considerable flexibility for researchers to define the outcome measures of interest, group outcome variables into different hypothesis ‘families’ or domains, identify population subgroups to test for heterogeneous effects, and include or exclude covariates. PAPs are arguably particularly valuable, therefore, when there are a large number of plausible outcome measures of interest and when researchers plan to undertake subgroup analysis.”
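To make the idea of grouping outcomes into hypothesis "families" concrete: a common way to implement it is to collapse each pre-specified family into a single mean effects index, standardizing every outcome against the control group and averaging the resulting z-scores, so that one test per family replaces many tests per outcome. The sketch below is an illustrative implementation under that general approach, not the paper's actual code, and the function and variable names are hypothetical.

```python
import numpy as np

def mean_effects_index(outcomes, control_mask):
    """Collapse a family of outcome variables into one index per unit.

    outcomes     : array of shape (n_units, k_outcomes), one column per
                   outcome in the pre-specified family
    control_mask : boolean array of shape (n_units,), True for control units

    Each outcome is z-scored against the control group's mean and standard
    deviation, then the z-scores are averaged across the family.
    """
    outcomes = np.asarray(outcomes, dtype=float)
    ctrl = outcomes[control_mask]
    z = (outcomes - ctrl.mean(axis=0)) / ctrl.std(axis=0, ddof=1)
    return z.mean(axis=1)  # one index value per unit
```

The treatment effect is then estimated on this single index rather than on each of the family's outcomes separately, which limits the scope for cherry-picking individual significant outcomes after the fact.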

Moreover, “[t]he process of writing a PAP may have the side benefit of forcing the researchers to more carefully think through their hypotheses beforehand, which in some cases could improve the quality of the research design and data collection approach.”

We were careful to include some of the risks and trade-offs involved in using a PAP, including the concern “that important hypotheses will be omitted from the initial plan,” and “that the exact econometric specification laid out in advance does not describe the data as well as one that would have been chosen ex post if the authors had first ‘let the data speak,’ potentially leading to less precise estimates.”

We offer suggestions of ways to mitigate these risks and “advocate a compromise position that allows some researcher flexibility accompanied by the ‘price tag’ of full transparency — including a paper trail of exactly what in the analysis was prespecified and when, and public release of data so that other scholars can replicate the analysis — with the hope that this approach will foster the greatest research progress.”

Overall, the value of using a PAP is clear, especially when the results of this registered study are compared with those of an unregistered evaluation of the same program. As explained in the previous video, an unregistered study carried out by the World Bank found more positive impacts than we did. It is difficult to tell, however, whether selective reporting or specification searching was involved in that analysis.

You can read the whole paper, the pre-analysis plan, and a summary of our evaluation for J-PAL by clicking on the links in the SEE ALSO section at the bottom of this page.


Casey, Katherine, Rachel Glennerster, and Edward Miguel. 2012. “Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan.” The Quarterly Journal of Economics 127 (4): 1755–1812.