Research Transparency Meeting with CGD

By Garret Christensen (BITSS)


Though BITSS hopes to increase research transparency across the social sciences, several of us, myself included, have a background in development economics. So we were happy to take part in a meeting last week at the Center for Global Development (CGD) in Washington, DC. In addition to BITSS and CGD, representatives from the International Initiative for Impact Evaluation (3ie), the Inter-American Development Bank, InterAction, Innovations for Poverty Action (IPA), the Millennium Challenge Corporation (MCC), the World Bank research group, the United States Agency for International Development (USAID), and the US Treasury were present.

I was impressed by how much agreement there was, and by how interested these large, sometimes slow-moving organizations seemed to be. But I should probably temper my enthusiasm a bit: the people in the room were not randomly selected from their respective agencies, and even if they had been, we may still be a long way from actual policy changes and wider adoption. Regardless, we had a fruitful discussion about some of the roadblocks on the way to increased transparency.

Here are a few of the themes we discussed, mostly obstacles to increased transparency:

  1. Bill Savedoff and Michael Clemens from CGD suggested we need to increase the demand for transparency if we expect people to supply it. So if we want people to share their data, or to write the exhaustive appendices, original surveys, and protocols that let others reproduce their work, we should actively be asking for, reading, using, and citing the protocols that are already out there. Others suggested we need to change norms so that transparency becomes the default for how research is done. I was a little surprised that, in a room with so many economists, I actually had to be the one to say that money could pretty easily create norms: if funders held back the last 10% of a grant until the data and code were online, then boom, we’re done, data gets shared.
  2. Justin Sandefur and David McKenzie suggested that adoption of state-of-the-art methods is fairly common among the RCT crowd, but that there has been less buy-in from observational researchers. Justin also suggested we should shorten the length of time authors are granted a monopoly on the data they collect.
  3. Representatives from big development project implementers such as the World Bank and the Millennium Challenge Corporation said they have a hard time even defining what their interventions are, let alone evaluating them. How can you pre-specify a hypothesis if you’re constantly re-evaluating what the intervention is or should be? The phrase “a project in search of a plan” was used to describe the process of identifying an intervention and an evaluation design.
  4. Replication is to scientists what taxes are to liberals, said 3ie’s Annette Brown: great in theory, but, as President Obama’s brief interest in taxing 529 education plans showed, getting approval in practice is often much more difficult. 3ie has been running a replication project, and it has faced a few issues: low interest from researchers and publishers, longer-than-expected timelines, and perhaps some researchers unhappy with the process. Given that replication is something so many researchers say they want, what is the right role for a third party to play in making replications more prevalent?

If you have any suggestions for solving these issues, we’d love to hear them! Attendees expressed interest in a larger meeting, possibly in the fall, to continue the discussion and bring more funders and implementers to the table. We’ll keep you posted.