Articulating the problem and possible ways forward

Fortunately, a growing community of researchers is working across disciplines to understand the causes of the reproducibility and credibility crisis and to develop solutions. The Berkeley Initiative for Transparency in the Social Sciences, or BITSS, is one part of this growing movement, aiming to provide leadership and guidance for the scientific community.


Social scientists have begun promoting higher standards of transparency in order to improve both the quality and credibility of research.

With a prevailing “dysfunctional reward structure” that incentivizes researchers to report data that are more publishable than accurate, it is important to realign scholarly incentives with scholarly values, especially since policy decisions based on research affect the lives of billions of people.

In 2014, a few of my colleagues and I wrote an article for Science’s Policy Forum discussing this. In it, we explain:

“There is growing appreciation for the advantages of experimentation in the social sciences. Policy-relevant claims that in the past were backed by theoretical arguments and inconclusive correlations are now being investigated using more credible methods. Changes have been particularly pronounced in development economics, where hundreds of randomized trials have been carried out over the last decade.”

My colleagues – researchers leading the movement for research transparency and reproducibility – and I converge on three core practices: disclosure; registration and pre-analysis plans; and open data and materials.

1) Disclosure calls on researchers to abide by reporting standards that require them to “document and disclose key details about the data collection and analysis”. This includes information about measures, manipulations, data exclusions, and how final sample sizes were determined.

2) Registration and pre-analysis plans (or PAPs) credibly distinguish hypothesis testing from hypothesis generation and exploratory research. Pre-analysis plans are documents, written before the data are analyzed, that specify statistical models, variables, and corrections for multiple testing (a minimal illustrative sketch follows this list). We’ll get more into these in Week 3.

3) Having open data and materials allows other researchers to review, revise, and reproduce results to further assess a study’s external validity. We’ll learn more about open data in Week 4.
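To make the pre-registration idea concrete, below is a purely illustrative Python sketch of what registering a pre-analysis plan accomplishes. Every study detail and field name is invented for this example and is not a required format; the point is that a plan written down and fingerprinted before analysis makes later changes detectable.

```python
import hashlib
import json
from datetime import datetime, timezone

# A hypothetical pre-analysis plan, drafted before the data are analyzed.
# Every field name and study detail here is invented for illustration.
pap = {
    "title": "Effect of a cash transfer on school attendance",
    "drafted_on": datetime.now(timezone.utc).isoformat(),
    "primary_hypothesis": "Treatment increases attendance relative to control.",
    "outcome_variables": ["attendance_rate"],
    "statistical_model": "OLS of attendance_rate on treatment, with district fixed effects",
    "multiple_testing_correction": "Bonferroni adjustment across two secondary outcomes",
    "planned_sample_size": 1200,
}

# Serialize the plan and compute a fingerprint of the exact text.
# Posting the document (or even just this hash) to a public registry before
# running any analysis is what lets readers credibly distinguish
# pre-specified tests from after-the-fact exploration.
plan_text = json.dumps(pap, indent=2, sort_keys=True)
fingerprint = hashlib.sha256(plan_text.encode("utf-8")).hexdigest()

print(plan_text)
print("SHA-256 fingerprint:", fingerprint)
```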

Along with the adoption of these practices, organizations have emerged to facilitate the movement towards transparent research. The Open Science Framework (or OSF) is a platform created by the Center for Open Science on which researchers can make their data and other research materials public, while similar organizations archive randomized controlled trials (RCTs) and pre-analysis plans, or assist with study pre-registration.
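As one hedged example of what working with such a platform can look like, here is a minimal sketch that uses the community-maintained osfclient Python package to list the files in a public OSF project. The project ID is a placeholder, and the exact call signatures are an assumption to verify against the package’s current documentation.

```python
# pip install osfclient  -- a community-maintained client for the OSF API.
# Calls below are a sketch; check the osfclient docs for current signatures.
from osfclient import OSF

osf = OSF()                      # anonymous access suffices for public projects
project = osf.project("abcde")   # hypothetical five-character OSF project ID

# Iterate over files in the project's default storage provider and print paths.
storage = project.storage("osfstorage")
for remote_file in storage.files:
    print(remote_file.path)

# Downloading a file to disk would look roughly like this (path illustrative):
# with open("replication_data.csv", "wb") as out:
#     remote_file.write_to(out)
```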

The Berkeley Initiative for Transparency in the Social Sciences, or BITSS, was also established to provide tools and resources that promote and facilitate research transparency. You can find some of these tools in our online resource library here. Garret Christensen, Jeremy Freese (a sociologist at Stanford University), and I are also finishing a textbook, due out in the next few months, that goes into more depth about the topics we introduce in this course. If you’re interested in learning more, keep an eye out for “Transparent and Reproducible Social Science Research”!

With all of these new developments, social science researchers have the potential to provide more credible evidence to policy-makers and other decision-makers. Knowing this, what reservations might researchers have about adopting more transparent practices? If these three steps can significantly reduce fraud, why aren’t more people taking them? These questions may be easier to answer as you learn more about each topic, but they are good ones to keep in mind as you go through the course.

You can read the entirety of our paper by clicking on the link in the SEE ALSO section at the bottom of this page.


Recess

Also, if you have a spare 20 minutes, check out this clip from John Oliver’s Last Week Tonight on Scientific Studies. He takes a humorous, if slightly cynical, approach to the crisis of reproducibility and introduces a few concepts we’ll get into later.


References

Miguel, E., C. Camerer, K. Casey, J. Cohen, K. M. Esterling, A. Gerber, R. Glennerster, et al. 2014. “Promoting Transparency in Social Science Research.” Science 343 (6166): 30–31. doi:10.1126/science.1245317.