An interview with Megan Becker, University of Southern California
By Aleksandar Bogdanoski, BITSS
Replication underscores the importance of transparency in research methods and data and is critical to ensuring that science is self-correcting. Though efforts to improve replicability led by many in the open science community, including BITSS, have focused mainly on quantitative research (e.g., the forthcoming Social Science Reproduction Platform), some are pushing to extend the same values to qualitative research. One such project, led by Megan Becker, an Assistant Professor of Political Science at the University of Southern California and a BITSS Catalyst, involved replicating a foundational paper on conflict and natural resources with her undergraduate students. The replication led to a follow-up project and offered important lessons for the reproducibility of Megan’s own research. Below are excerpts from a conversation I had with her about the project. Learn more in this MetaArXiv preprint and this paper.
This post is also published on the CEGA Blog.
Aleks: Why did you decide to use replication as a teaching tool in your class?
Megan: We conducted the replication exercise as part of an intensive, month-long course titled “Ecological Security,” which I co-taught with my colleague Jonathan Markowitz. Our motivation was to teach students about natural resources and the political economy of conflict and to demystify the research process for them. For many of the students, this was their first real research experience, so the replication allowed them to learn about research with “training wheels.”
Transparency is part of methods instruction by default: whenever students do applied methods work, they are asked to show and document their work clearly. In my opinion, teaching qualitative methods should involve not only talking about the “why,” but also showing students the “how” of research. So far, we have done the exercise five times as part of this class, and I have also used a shorter version of the assignment in my “Introduction to Research Methods” course. Each time, we have learned something new and made improvements.
“ (…) [T]eaching qualitative methods should involve not only talking about the ‘why,’ but also showing students the ‘how’ of research.”
A: How did you select the paper to replicate?
M: We chose Michael Ross’s 2004 paper “How Do Natural Resources Influence Civil War?”, a foundational work on the relationship between natural resource wealth and civil war. I know the paper very well, and since it’s a medium-N study (13 cases), it provides a lot of material to analyze. We didn’t have the replication materials when we chose the paper, though obviously, having them would have made our job much easier. If I were to do this again, given the course’s relatively short duration, I would factor the availability of replication materials into the choice.
A: Since you didn’t have access to the replication materials, such as the data and the codebook, you probably had to obtain them from the author. Did you learn anything that could be useful for others who may find themselves in a similar situation?
M: In general, I think that original authors may be more receptive and cooperative in replications conducted as part of coursework, since the focus there is on providing a learning opportunity for students.
We learned that constructive, transparent communication with the original authors is key. A recommendation by BITSS Catalysts Nicole Janz and Jeremy Freese that we tried to follow was to work with the original authors and let them know that we were replicating their work. Michael Ross was accommodating and cooperative—he shared all of his documentation and case notes. Since he’s also based in Los Angeles, we took the students to meet him in person (we did the project before the pandemic). They used that opportunity to ask him questions about his motivation and research process. I can’t speak for him, but I know that the students found this experience very rewarding. I would encourage other replicators to do the same. There are no excuses now that we have Zoom!
A: What was the result of your replication?
M: When I introduced the assignment to the students, I emphasized that this was not a “gotcha” mission and that their grade did not depend on the result; the purpose was to engage in research. The students painstakingly reviewed all thirteen causal mechanisms from the paper and determined what counted as evidence for each mechanism. Multiple students reviewed each case.
The result turned out to be much more interesting than I expected! Though we found that the original paper was sound overall, we could replicate only ~76% of the case coding, and the remaining ~24% had implications for the paper’s overall conclusions: some conclusions flipped, and others were less well supported than before. For example, examining nine cases of “resource battles,” Ross found that cooperation between the state military and rebels decreased the conflict’s intensity in eight of the nine battles. When we replicated the analysis, we too found evidence of cooperation in eight of the nine battles, but cooperation decreased the conflict’s intensity in only two instances. We verified the findings through several inter-coder reliability checks and consultations with Ross, which we believe should preserve the integrity of the replication and safeguard against critiques of cherry-picking or selective reporting.
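The interview doesn’t detail how those inter-coder reliability checks were run, and the statistic the team used isn’t named. As a rough sketch of what such a check can look like, the hypothetical Python example below compares two coders’ binary codings of 13 cases, reporting both raw percent agreement and Cohen’s kappa, a common chance-corrected statistic (the data and coder labels here are invented for illustration).

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(coder_a)
    # Observed agreement: share of items both coders labeled identically.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum((freq_a[label] / n) * (freq_b[label] / n)
                     for label in set(coder_a) | set(coder_b))
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical codings: did each of 13 cases support a mechanism (1) or not (0)?
coder_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
coder_b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1]

agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
print(f"Percent agreement: {agreement:.0%}")                       # 85%
print(f"Cohen's kappa:     {cohens_kappa(coder_a, coder_b):.2f}")  # 0.64
```

Raw agreement can overstate reliability when one code dominates, which is why chance-corrected statistics like kappa are often reported alongside it.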
We shared the results via a preprint on MetaArXiv, which includes as co-authors my colleague Jonathan Markowitz and five of the students who participated in the class. It also led to a follow-up project, in which we expanded the original paper’s 13 cases to the full universe of 46 cases. Through my research lab and the NSF Research Experiences for Undergraduates program for which I am the PI, 30 students from four institutions have now worked on the project. So it turned out to be a very worthwhile paper to replicate!
“(…) [G]ood workflow documentation (…) is important for research transparency, but also makes the instructor’s job much easier.”
A: What advice would you offer to other instructors who want to use replication as part of their class?
M: Approach this as you would any teaching tool you want to use in your class, asking what you want your students to learn from it. In our case, I wanted the students to learn about substantive issues in the discipline, get involved in the research process, and appreciate research transparency. We did this by asking them to create a codebook, examine the original paper for its use of transparency tools and practices, think about potential improvements, and develop expertise on a particular case covered in the study.
Another important consideration is good workflow documentation, which is essential for research transparency and makes the instructor’s job much easier. In this case, I asked the students to create a “Data Appendix” and a compendium of the sources they had referenced, highlighting the information that informed their coding choices. All of this took a lot of planning and was a significant upfront investment. However, since we have shared our instruction materials online, I hope other instructors can adapt and use them.
A: Were there any resources that you found particularly useful as you were preparing this exercise?
M: The way I developed the exercise was influenced by efforts to improve transparency in political science, such as the Data Access and Research Transparency (DART) Statement and work by Arthur Lupia and Colin Elman on production and analytic transparency. I also found it very helpful to attend RT2 Los Angeles in 2018! I went to the training looking for ways to apply the knowledge to my teaching and left with a very helpful framework for talking to students about transparency, particularly those who are new to the research process.
Finally, I used many resources from Project TIER. On their website, instructors can find many course syllabi, exercises, and project instructions that incorporate aspects of research transparency and reproducibility.
“(…) [T]his project caused me to approach my own research differently because it challenged me to think about it from my students’ perspective and others who are new to academic research.”
A: What haven’t we asked about this project that we should have asked? (And how would you answer that question?)
M: What I found most surprising as we conducted the replication exercise was just how helpful and meaningful it was for me as a faculty member and researcher. First, as someone who teaches research methods and runs a research lab, it was inspiring to see the students get involved with research for the first time and embrace the experience. Second, this project caused me to approach my own research differently because it challenged me to think about it from the perspective of my students and others who are new to academic research. Being asked, “why do you do things like that?” made me realize that “that’s how things are done” is not a sufficient explanation. The exercise also opened up space for innovation on my end, pushing me to make sure that everything I do as a researcher is transparent and reproducible!
Megan Becker is an Assistant (Teaching) Professor in the Department of Political Science and International Relations at the University of Southern California and PI of the NSF Research Experiences for Undergraduates Program “Data Science and the Political Economy of Security.” She is the 2017 recipient of the Craig L. Brians Award for Excellence in Undergraduate Research and Mentorship, awarded by the American Political Science Association, and was a 2019-20 Faculty Fellow with the Project on Teaching Integrity in Empirical Research. Her research interests include civil-military relations, national security, and research methods education.