Open Policy Analysis

Debate is an inherent aspect of policymaking. On issues as diverse as raising the minimum wage, providing universal health care, and taxing carbon emissions, policymakers rely on research and policy analysis to support their positions and inform decisions. Policy analysis is empirical work conducted by government agencies, think tanks, and academics to evaluate the impacts of policies and reforms on people and the economy.

However, current practices in policy analysis often fall short of the principles of reproducibility and transparency: analysts frequently fail to fully disclose key aspects of the research, data, and guesswork that go into a policy report. This makes it easy for actors on different sides of a debate to cherry-pick facts, models, and estimates that forward their beliefs, calling into question the evidence-based character of the policymaking process. Opening up the policy analysis process allows for critical appraisal of the analyses within reports.

To strengthen transparency and reproducibility in policy analysis, BITSS proposes Open Policy Analysis (OPA), an approach to policy analysis that incorporates open science tools and practices (see Figure 1 below). OPA is based on the following principles:

  1. Computational Reproducibility – All materials (raw data, code, and supporting documents) should be made available to allow a policy report to be reproduced with minimal effort. This principle incorporates the use of literate programming, version control, and standardized file structure and labeling.
  2. Analytic Transparency – All elements of the analysis should be easily accessible and readable for critical appraisal and improvement. This also entails disclosing all methodological procedures and underlying assumptions behind the report. Analytic transparency can be achieved through the use of open data, code and materials, and Dynamic Documents.
  3. Output Transparency – The structure and format of policy reports (particularly tables and visualizations) should be pre-specified and reviewed before the results of the analysis are known, and should be consistently applied across different analyses and iterations of the same report. Output transparency ensures that experts and policy makers judge the merits of the report independently of the results. Open science tools and practices related to output transparency include the use of reporting guidelines, preprint services, and open access publishing.
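As a minimal illustration of how these three principles can fit together in a single workflow (a hypothetical sketch, not a BITSS tool; all parameter names and numbers are invented for the example), a policy estimate can be produced by a literate script in which every assumption is an explicit, disclosed input and the report format is fixed independently of the result:

```python
# Hypothetical sketch of a reproducible policy-analysis pipeline.
# All parameters, model choices, and numbers are illustrative only.

# Guesswork and policy inputs, disclosed up front (analytic transparency).
ASSUMPTIONS = {
    "elasticity": -0.2,     # assumed labor-demand elasticity (guesswork)
    "wage_increase": 0.10,  # policy input: a 10% minimum-wage increase
}

def estimate_employment_effect(assumptions):
    """Model step: a deliberately simple model, effect = elasticity * increase."""
    return assumptions["elasticity"] * assumptions["wage_increase"]

def render_report(effect, assumptions):
    """Output step: the report structure is pre-specified and does not
    depend on the sign or size of the result (output transparency)."""
    lines = ["# Policy Report (auto-generated)", "", "## Assumptions"]
    for name, value in sorted(assumptions.items()):
        lines.append(f"- {name}: {value}")
    lines += ["", "## Estimated employment effect", f"{effect:+.2%}"]
    return "\n".join(lines)

# Running the script end-to-end regenerates the full report from raw inputs
# (computational reproducibility).
report = render_report(estimate_employment_effect(ASSUMPTIONS), ASSUMPTIONS)
print(report)
```

Under version control, a later iteration of the report would change only the disclosed assumptions or model code, so every added or removed element is visible in the revision history.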


Figure 1: The workflow of Open Policy Analysis. All components of the policy analysis (Research, Data, Guesswork, and Model) are open and reproducible; the relevant results and their format are critically examined before the final estimates are published; and policymakers on all sides use the same parts of the report to support their decisions. When a similar report is needed a few years later, the starting point is the original (reproducible) report, and all added (+) and subtracted (-) elements are disclosed.

If you’re interested in learning how to bring OPA to your institution or would like to nominate a policy report, please contact BITSS Postdoctoral Scholar Fernando Hoces de la Guardia (fhoces@berkeley.edu) with any questions.