Guest Post: Zacharie Tsala Dimbuene, University of Kinshasa, Democratic Republic of the Congo
Everything started in March 2016, when I attended a workshop in Athi River (Nairobi, Kenya) organized by the Berkeley Initiative for Transparency in the Social Sciences (BITSS) about “Research Transparency in the Social Sciences” (RTSS, hereafter). What caught my attention first was that all attendees, with the exception of myself, were Anglophones. I was happy to discover this new movement aimed at increasing the reliability of research, but at the same time unhappy that Francophone Africa was lagging behind. I didn’t hesitate: I decided to become a BITSS Catalyst to spread RTSS principles and techniques.
In May 2016, I attended another workshop in Paris, France, where over 90% of attendees were Francophones. I used this opportunity to ask a simple question: “Have you heard of RTSS?” The response was a big NO. At that time, I was in touch with the BITSS team to sponsor a two-day RTSS workshop at the University of Kinshasa, Democratic Republic of the Congo, one of the biggest Francophone countries in Africa and the world. A big thank you to the BITSS team for your generosity! The workshop went very well, with approximately 30 enthusiastic participants, including professors, teaching assistants, and graduate students (PhD and Masters). The workshop covered the principles of RTSS, pre-analysis plans, and dynamic documents (e.g., using Git and MarkDoc in Stata).
The first day went very well, and participants loved the discussion about research transparency. A highlight came when, in small groups and then in a plenary session, we discussed attempts to replicate or reproduce one’s own findings using available datasets. Some participants mentioned that they lacked a clear understanding of the dataset or were unaware of its sampling design. Interestingly, attendees pointed out that pieces of research are often a “black box”; RTSS therefore holds promise for increasing research reliability, which has many implications for policies, programmes, and interventions.
Another interesting debate was about “data mining”. The discussion started with “p-hacking”: the practice of searching through analyses until statistically significant results emerge, driven by the pressure to publish and by journals’ preference for significant findings. Attendees mentioned that they are attracted by “p-values” because they want their pieces of research to be published. I jumped on this occasion because my organization had recently interviewed a researcher for a position who openly stated doing “data mining” to get statistically significant results, which are more likely to be published. Data mining is a reality in research. As it substantially reduces the quality of research, I am convinced that RTSS is the only way to reduce and/or eliminate this misbehavior.
The second day was more technical and demanding. Ideally, attendees should have had some Stata background, but some had no Stata experience at all. To overcome this, I organized the attendees into small groups and asked each group to nominate a leader, preferably someone with Stata experience, to help them work through the exercises on MarkDoc and on cleaning Stata code.
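To give a flavor of the dynamic-document exercises, here is a minimal sketch of the kind of do-file we worked through. It assumes the user-written MarkDoc package is installed from SSC (along with its dependencies, such as Pandoc); the file name and analysis are illustrative only, not the actual workshop material.

```stata
// One-time setup (assumed): install MarkDoc from SSC
// ssc install markdoc

/***
# Reproducible analysis example

This section is written in Markdown inside a MarkDoc
comment block and becomes the narrative of the report.
***/

// Load a built-in example dataset and run a simple summary
sysuse auto, clear
summarize price mpg

/***
The summary statistics above are regenerated every time
the do-file is run, so text and results never drift apart.
***/
```

Running `markdoc example.do, export(html) replace` (assuming the do-file is saved as `example.do`) weaves the Markdown blocks and the Stata output into a single HTML report, which is the core idea of dynamic documents we wanted attendees to take away.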
There were also some challenges, mainly related to organizational issues. First, the Internet connection was unstable, which made it difficult to work with some applications like Git. Fortunately, the offline version helped. Second, the workshop was organized in an environment where people are not used to paying for workshops; instead, they expect incentives from the organizer, which was very challenging.
In sum, Research Transparency in the Social Sciences in Francophone Africa is embryonic. Although the workshop was very welcome, a lot still needs to be done to promote this powerful idea and open the “black box” of published papers, where secrets arise, grow, and die with the researchers. For my part, I will devote my time and energy to applying RTSS in my research and teaching.