CONSORT originally aimed to address inadequate reporting of RCTs in medicine by identifying minimum standards for describing how RCTs were conducted and what they found. It has had a positive impact in medicine, and has since been extended to address a wide range of interventions and trial designs.
Unfortunately, compared with the medical sciences, uptake of CONSORT and other reporting guidelines has lagged in the social and behavioral sciences.
As a result, important details are routinely missing from publications of trials evaluating the effects of social and psychological interventions. Most reports of RCTs of social and psychological interventions are not sufficiently comprehensive to replicate the interventions provided, appraise study quality, or understand for whom and under which circumstances results might apply. On top of this, the complex nature of social and psychological interventions often demands greater attention to intervention design, delivery, uptake, and context.
Recently, the conduct and reporting of social and psychological intervention research have come under scrutiny because of the “replication crisis” in the social sciences. A recent NIH policy requiring registration and results reporting for RCTs that weren’t previously considered “clinical trials” has proved highly controversial.
Against this backdrop, social and behavioral scientists have taken steps to increase research transparency. For instance, there’s great enthusiasm for data sharing, and there are emerging standards for promoting an open research culture.
However, even as data, code, and study materials become widely available, journal articles are likely to remain the cornerstone of scientific discourse. Most of us don’t have the time, skills, or interest to reanalyze the studies we read. Similarly, registrations are of little value if we don’t have clear reports summarizing the results.
While researchers’ claims are far more believable when they’re based on pre-specified methods and backed by open data, the usefulness of each step depends on the others. That is, new data sharing and registration initiatives will have greatest value if we also have clear and comprehensive reports describing study methods and results.
Increasing trust in the social and behavioral sciences
Published today in Trials, CONSORT-SPI 2018 is a CONSORT extension for Social and Psychological Interventions that we developed to improve reporting of these trials and thereby increase trust in the social and behavioral sciences.
CONSORT-SPI 2018 identifies the minimum information needed to understand and apply the results of RCTs that evaluate interventions thought to work through social and psychological mechanisms of action. To facilitate adherence to the CONSORT-SPI 2018 checklist, we also created an Explanation and Elaboration document that provides guidance tailored to concepts, theories, and taxonomies used in the social and behavioral sciences.
Ultimately, we hope to help answer the “big question” of how and why interventions work, for whom, and under what conditions. To do so, we need to implement strategies to increase transparency and trust in the social and behavioral sciences, not just pay them lip service.
This week’s publication of CONSORT-SPI 2018 aims to move these efforts forward. With support from the research community, this guideline could help boost confidence in and use of our research among policymakers and, perhaps most importantly, the public at large.
We look forward to working with authors, editors, and other stakeholders to translate this new reporting guideline into policy and practice. We are also eager to hear from the research community about how CONSORT-SPI 2018 works in practice and how we might improve it in the future. We included the year “2018” in the title to reflect our commitment to update this guideline regularly. Please send us your feedback by email to email@example.com.
This post was originally published on the BioMed Central On Medicine blog on July 31, 2018.