Garret Christensen, BITSS Project Scientist
BITSS has recently launched an initiative to expand research transparency efforts in developing countries. I think this is a very good fit for BITSS: we are housed within the Center for Effective Global Action (CEGA), our faculty lead Ted Miguel is a preeminent development economist, our program director Jen Sturdy managed impact evaluations at the World Bank and the Millennium Challenge Corporation for years before joining our team, and I was trained as a development and labor economist in my PhD program.
One of our first efforts along these lines was for me to visit Nairobi this past week and present to two groups that conveniently happened to be holding their research conferences in the same hotel. While the ~48 hours of air travel required to go from California to Nairobi for only a three-day trip may seem a bit much (especially if the in-flight entertainment system’s copy of Finding Nemo cuts out right at the climax of the film), presenting to both groups was a wonderful opportunity. The first group was an impact evaluation summit organized by 3ie and the Alliance for a Green Revolution in Africa (AGRA), with attendees from the Bill & Melinda Gates Foundation and the MasterCard Foundation, among others. Here I presented a brief introduction to research transparency, which you can find here. I then gave a more in-depth series of presentations to 18 researchers brought together by the African Population and Health Research Center (APHRC): five sessions, or two half-days, as part of a three-day training on research management and governance for researchers and research managers from universities and organizations across Kenya and Uganda.
My presentation sessions were:
- Introduction to research transparency.
- Registration and pre-analysis plans as solutions to the problems of publication bias and specification searching.
- Software tools for a reproducible workflow.
- Data-sharing and replication.
- Wrap-up: overview of current events, grants, opportunities, and next steps in research transparency.
I still need to look at the feedback more closely, but it was pretty clear that the training felt a bit like trying to sip from a fire hose. For example, in my session on software, the key reproducibility tools I think researchers should know to improve their workflow are the Open Science Framework (OSF), GitHub, dynamic documents with RStudio, and Dataverse. That is far too much to cover in two hours, so I ended up focusing on the OSF and skipping GitHub and dynamic documents entirely. Still, I think the researchers came away excited to use the OSF’s built-in version control, which is a great step toward reproducibility.
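For readers new to version control, here is a minimal sketch of the workflow those tools support, using plain Git at the command line (GitHub and the OSF add hosting and collaboration on top of this core idea; the file name and commit message are illustrative only):

```shell
# A minimal version-control sketch: record a snapshot of an analysis script
# so every change to it is tracked and recoverable later.
mkdir -p reproducible-project && cd reproducible-project
git init -q                                # start tracking this folder
echo 'summary(lm(y ~ x, data = d))' > analysis.R
git add analysis.R                         # stage the script for the snapshot
git -c user.name="Researcher" -c user.email="researcher@example.org" \
    commit -q -m "Add initial regression script"
git log --oneline                          # one line per recorded version
```

Each subsequent `git add` / `git commit` records a new version, so a collaborator (or your future self) can see exactly how the analysis evolved.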
At BITSS we’re looking to turn these slides and workshops into easily shared, widely reusable documents, so if you have any comments or suggestions, we would love to hear them. If you’re a researcher associated with a developing-country group, we are also looking to continue our outreach, so please contact us if you are interested in having BITSS present similar trainings to your organization.