Some SSMART reflections on who is doing research on transparency and doing that research transparently!

BITSS is delighted with the steady release of the results from our SSMART-funded research portfolio, and with the incredible response to some of the projects.

It goes without saying that SSMART aimed to fund research on topics that would help fuel improved transparency and reproducibility in the social sciences. When we started in 2015, that made us one of the first sources of competitively awarded funding for research on transparency. You might think that attracts a certain kind of researcher, so we thought we’d share some basic metrics on WHO applied to and was funded under SSMART across the three rounds of selection from 2015 to 2017:

  • A total of 82 applications requesting $2.25 million were submitted across our three research categories. Of those, 39% were in category 1 (methods), 33% in category 2 (meta-analysis), and 28% in category 3 (researcher practice).
  • Guided by independent review committees, BITSS competitively selected 22 projects (a 27% selection rate) for a total of $598K: 27% were in category 1, 46% in category 2, and 27% in category 3.
  • The average year of PhD degree completion for SSMART principal investigators was 2007 for both applicants and selected projects.
  • Across the three main disciplines, 29% of applications and 32% of selected projects came from psychology; 15% of applications and 14% of selected projects came from political science; and 25% of applications and 41% of selected projects came from economics. Given that economics has lagged behind psychology and political science on other metrics, such as journal uptake of the TOP guidelines, we hope this is a sign of good things to come.
  • BITSS also aimed for gender balance across its funded projects when possible. Reflecting the fact that BITSS faces the same challenges discussed here, 30% of applications and 23% of selected projects were led by female principal investigators, a reminder that more work is needed to foster gender balance in our research efforts and in open science more broadly!
  • While the majority of applications came from researchers based in the US, Europe, and the UK, the other hotbed for research on transparency appears to be Australia and New Zealand, which accounted for 10% of applications and 9% of selected projects.
  • Given that BITSS leveraged available funding to scale up these efforts in the Global South, it’s worth noting that 11% of applications and 9% of selected projects came from principal or co-principal investigators based in Sub-Saharan Africa.

Are these stats reflective of what’s happening across the social sciences? We’re not sure, but we’re developing a new research project to help answer that question – more to come soon!

Now, savvy members of our community may notice a few things that we think are also worth reflecting on here because, after all, we weren’t just funding research ON transparency; we were funding TRANSPARENT research on transparency. This meant that SSMART funding for empirical research was tied to specific deliverables intended to make the funded research more transparent and reproducible, identified not only in the Request for Proposals but also in the grant Award Letters. Those deliverables were:

  1. Registration and Openness. SSMART grant recipients had to establish an account with the Open Science Framework (“OSF”, http://osf.io) and create a new project page for the research funded by the grant. The OSF page had to include a pre-analysis plan describing the hypothesis or hypotheses to be examined in the study, the statistical model(s), and the methodologies to be used. After completing the pre-analysis plan, grantees had to pre-register the study using the “OSF-Standard Pre-Data Collection Registration Template”. In addition, data had to be version controlled and shared on the study’s OSF page within six (6) months of the publication date of a final report, subject to IRB or other confidentiality agreements. All code used to analyze data under the grant, as well as any final results, such as papers or reports, had to be posted on the OSF within one year of the end of data collection, unless otherwise agreed. That’s FIVE actions required to make the research transparent. (For grantees who like to script their workflows, see the sketch after this list.)
  2. Dissemination Events. Recipients also had to present their findings through at least one dissemination event.

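Posting these materials is usually done through the OSF web interface, but it can also be scripted. Below is a minimal sketch of uploading a pre-analysis plan to an existing OSF project, assuming the community-maintained osfclient Python library; the access token, project ID, and file name are hypothetical placeholders, and the formal pre-registration step itself still happens through the OSF web interface.

```python
# Minimal sketch: upload a pre-analysis plan to an existing OSF project
# using the community osfclient library (pip install osfclient).
# The token, project ID, and file name below are hypothetical placeholders.
from osfclient import OSF

osf = OSF(token="YOUR_OSF_PERSONAL_ACCESS_TOKEN")  # hypothetical token
project = osf.project("abc12")  # hypothetical five-character OSF project ID

# OSF projects expose one or more storage providers; "osfstorage" is the default.
storage = project.storage("osfstorage")

# Upload the file so it appears under the project's Files tab.
with open("pre_analysis_plan.pdf", "rb") as fp:
    storage.create_file("pre_analysis_plan.pdf", fp)
```
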
For dissemination events, we’ve tried to keep up with all the action, and we’ve certainly seen SSMART research presented at the 2015 and 2016 BITSS Annual Meetings, as well as at the 2017 American Educational Research Association (AERA) Annual Meeting during a session entitled “Addressing Practical Concerns to Conducting a Meta-Analysis”; the 2017 Midwest Political Science Association (MPSA) Conference in a session titled “Improving the Presentation and Analysis of Statistical Models”; and the upcoming 2017 Western Economic Association International (WEAI) Annual Conference. Interested in having a SSMART lecture near you? We still have some funding available to support travel – let us know!

And how did we fare on the other outcomes? Are researchers who conduct research on transparency making their own research more transparent? Well, two projects were only just awarded in May 2017, so we exclude them from this analysis. For the other 20 SSMART projects:

  • 100% of the projects established a project page on OSF; and
  • 75% of projects have posted working or final papers on OSF, which have been downloaded an average of 42 times. Those with papers on the BITSS Preprints service have been downloaded an average of 155 times!

For the 17 of the 20 SSMART projects that involve empirical research, our data tell us:

  • 88% posted pre-analysis plans, which have been downloaded an average of 51 times;
  • 76% of projects were pre-registered on the OSF; and
  • 18% of projects have posted data and/or analysis code, which is pretty amazing considering research teams have six months from completion of their paper to publish their data and code!

This initial analysis suggests something we already suspected: the more effort a practice requires (writing a PAP, pre-registering a study, publishing data and code), the harder it is to reach 100% compliance. It also tells us something else: pre-analysis plans can be a useful reference for other researchers, as we see comparable interest in downloading the pre-analysis plans (an average of 51 downloads) and the papers themselves (42)!
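
For readers who prefer raw counts to percentages, here is a small sketch in Python that backs out the project counts implied by the compliance rates above. The counts are our inference from the rounded percentages, not figures reported separately.

```python
# Back out the raw project counts implied by the reported percentages for
# the 17 empirical SSMART projects. These counts are inferred from the
# rounded percentages above, not reported separately by BITSS.
EMPIRICAL_PROJECTS = 17

implied_counts = {
    "posted a pre-analysis plan": 15,  # 15/17 rounds to 88%
    "pre-registered on the OSF": 13,   # 13/17 rounds to 76%
    "posted data and/or code": 3,      # 3/17 rounds to 18%
}

for practice, count in implied_counts.items():
    rate = 100 * count / EMPIRICAL_PROJECTS
    print(f"{practice}: {count}/{EMPIRICAL_PROJECTS} projects ({rate:.0f}%)")
```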

So our results are mixed, but we plan to check in again in six months, once the final deadlines for the 2015 and 2016 SSMART projects arrive and we can fully take stock of where we are compared to where we wanted to be, and use that to inform future SSMART research competitions. Stay tuned!
