Rachel Glennerster, Perspective from Economics
This is the first post of a video series in which we ask leading social science academics and experts to discuss research transparency in their discipline. This interview was recorded on December 12, 2013 at the University of California, Berkeley.
Full Transcript:
I’m Rachel Glennerster. I’m Executive Director of the Abdul Latif Jameel Poverty Action Lab, which is a center at MIT. I do randomized evaluations of social and economic programs. A lot of my work has been in Sierra Leone, but I’ve also done work in India, Bangladesh, and Pakistan.
What explains your interest in research transparency?
Part of my interest in transparency work comes from my involvement in four pre-analysis plans, where we committed in advance to how we were going to analyze the data. This is, I think, a useful tool, particularly in areas where you have a very large number of outcome variables and there is no single obvious outcome measure. I got involved in this doing some work on community-driven development where we had 310+ outcome measures. There were different ways of measuring social capital, and it was not obvious that there was one single measure; we needed all of them. Thinking ahead of time about how we were going to group those indicators was very important.
What have you learned from using pre-analysis plans?
I have been involved in doing this kind of work a number of times now and have realized just how complicated and difficult it is. In one of the projects I worked on, we had seven different arms. Trying to write down in advance exactly how you are going to analyze that, without knowing which arms were going to be positive, negative, or insignificant, turned out to be a complete nightmare. There were things we wished we had done differently, which led to my next attempt at this kind of transparency in analysis, where we came at it in a different way: writing down how we would do the first step, then doing that, and then using the information from the first step to write down how we would analyze the next bit of data. We split it up into these different stages, which is one of the things I am going to talk about at this conference [“BITSS 2013 Annual Meeting”, ed.], but I think that learning how to be transparent is still very much evolving.
What do you feel is the most important topic today in research transparency?
I think this issue of how you commit in advance to analyzing your data, and when and how you should do it, is something that is still evolving, and we need lots of discussion about it. I also think it is very important that we get more of the data from our experiments out and published. Everyone agrees that we ought to be doing this, but setting up systems to make it easier is very important. It is an enormous amount of work, and we do not have the right incentives to do it. Sometimes funders insist that we commit to publishing our data, and that is a very helpful incentive. But there is more we can do, and though this is not my area of expertise, we have people within J-PAL who are thinking about it, such as building steps into the data collection process that make it easier to publish the data later. Basically, anything we can do to make it easier is going to be a huge part of getting more data out on the web.
Where does BITSS fit in the picture?
One thing that is pretty unique about BITSS is that it is cross-disciplinary, spanning the social sciences. I don’t normally get to talk to people working in other areas of social science, except political science, where there are other fora in which we have these discussions. It is really helpful to understand the guidelines and ways of working that other disciplines are adopting, so that we can compare them with what economics is doing now. I can’t speak for economists in general, but I think these discussions are very important for establishing similar approaches. Sometimes we have similar objectives across the social sciences, but because we come at them in different ways, we can’t talk to each other. We say, “they don’t do things right, because they are not doing it our way,” when actually there is not much difference between us. Deliberately getting us to sit together and hash out, to the extent that we can, guidelines and common approaches is really helpful. If we are all going to move forward on transparency, it is very important to build consensus within the professions that this is important and that this is how we should approach the questions.