The BITSS Study

There is a growing consensus on the need for greater openness, integrity, and reproducibility in social science research. However, there is still limited empirical evidence on the extent of the problem; how it differs across disciplines, institutions, and age cohorts; and how transparency norms and practices are evolving over time.

The principal aim of the proposed BITSS Study is to start filling these gaps. We propose to measure awareness of, stated support for, and adherence to a wide range of emerging transparency practices in social science research (including data sharing, use of study registries and pre-analysis plans, and reproducible coding practices, among others). As we detail below, we intend to collect these measures among scholars in the four largest social science research disciplines: Economics, Political Science, Psychology, and Sociology.

Using financial incentives to boost survey response rates, we aim to collect samples that are broadly representative of the research community in these fields, across different types of institutions (e.g., public, private, government) and age cohorts. The study will be longitudinal, with annual follow-up surveys of the same individuals over multiple years, allowing us to capture the evolution of knowledge and practices over time. We will also cross-match survey responses with researchers’ actual behavior, as assessed by their observed use of public study registries, data repositories, OSF, GitHub, and other common transparency tools, to overcome the bias and errors inherent in conventional self-reported data.

While our hypothesis is that transparency practices are spreading in the social sciences, the speed and precise nature of this diffusion, and its concentration in certain fields, institutions, regions, or cohorts, remain largely a matter of speculation. Given its role as a leading “hub” for research transparency in the social sciences, BITSS is well-positioned to lead a large-scale study of this sort.

The study has a second principal aim. By creating an ongoing panel sample of thousands of active social science researchers, the proposed BITSS Study will also provide a sampling frame for randomized interventions that aim to accelerate the adoption of transparency research practices, for instance, information campaigns or incentives that boost adoption of new software platforms (e.g., OSF), use of pre-analysis plans, posting of data in public repositories, or other practices.

These two aims are highly complementary: the longitudinal data collected in the BITSS Study will highlight the areas where there is the greatest “need” or opportunity for improved practices, informing the design of these interventions. For instance, if data sharing is widespread in one academic discipline (e.g., Economics) but not in another (e.g., Sociology), an information intervention highlighting the benefits of using Dataverse may have greater potential impact in Sociology.

Examples of low-cost, scalable interventions might include the presentation of disclosure checklists or statements by journals to nudge authors toward desired behaviors; the introduction of (or greater publicity for) badges to recognize desired behaviors within a research community; and incentives to participate in BITSS or other research transparency trainings and activities. By tracking participants over time through the annual survey, and tracking their actual behavior (e.g., use of community registries and repositories, publication records, and collaborator networks), we can assess the impacts of these and other interventions.

We plan to collect data from a representative sample of researchers at different stages of the professional career, and across disciplines. Perhaps the central challenge to validity in the proposed study is achieving a sufficiently high response rate. Response rates may be low in the absence of financial incentives. While this needs to be pilot-tested, we assume an incentive of $20-$40 per respondent per round will be required to achieve a roughly 70% response rate, which would yield a broadly representative sample for the analysis. We intend to limit the survey length to no more than 10-12 minutes in order to limit attrition.


The funding needed to implement the BITSS Study will depend on the size of the sample, the necessary incentives, and the richness of the survey and other data collection activities (e.g., tracking respondent use of common study registries, data repositories, OSF, etc.). We estimate that a sample of the size proposed above will require roughly $325,000 per year, with much of the funding going either to research staff at BITSS or to financial incentives for respondents to fill in the online surveys. A three-year initial study period would provide sufficient data to speak meaningfully about trends over time, and across fields, leading to an estimated initial funding request of $975,000.
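The budget arithmetic above can be sketched as follows. The per-year figure and the three-year study period come from this proposal; the panel size and per-respondent incentive are hypothetical inputs (a midpoint of the $20-$40 range, and a placeholder sample in the "thousands") used only to illustrate how incentive costs scale within the annual budget:

```python
# Illustrative budget arithmetic for the proposed BITSS Study.
ANNUAL_BUDGET = 325_000   # estimated cost per year (from the proposal)
STUDY_YEARS = 3           # initial study period (from the proposal)

total_request = ANNUAL_BUDGET * STUDY_YEARS
print(f"Initial funding request: ${total_request:,}")   # $975,000

# Hypothetical incentive scenario: share of one year's budget
# consumed by respondent incentives.
sample_size = 5_000       # hypothetical panel size ("thousands")
incentive_per_round = 30  # midpoint of the $20-$40 range
incentive_cost = sample_size * incentive_per_round
print(f"Incentives per survey round: ${incentive_cost:,}")
print(f"Share of annual budget: {incentive_cost / ANNUAL_BUDGET:.0%}")
```

Under these illustrative assumptions, incentives alone would absorb a little under half the annual budget, consistent with the statement that incentives and staff time dominate the cost.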

Implementing interventions that aim to promote transparency practices will require additional funding, including additional staff research time and potentially financial incentives for particular behaviors. Depending on the nature of the intervention, we estimate each such study could cost roughly $40,000-$100,000. One potentially attractive approach to designing these interventions would be to introduce a competitive mechanism, perhaps as an extension of the SSMART grants (implemented by BITSS and funded by the Arnold Foundation; see http://bitss.org/ssmart). Scholars would apply with proposed interventions; proposals would be evaluated by a BITSS review committee; and the envelope of funding would be allocated across the strongest proposals. These winning proposals could then be implemented within the BITSS Study sample in ways that do not interfere with one another.