by Alex Grossman
In 1955, Robert “Bob” Rosenthal was a doctoral student in the Psychology department at the University of California, Los Angeles (he is now on the faculty at UC Riverside). He spent his days thinking about “psychological projection,” a theory about how we see our own experiences in others. So he designed a study to test whether individuals who had just undergone an experience of failure (or success) were more likely to perceive failure (or success) in the faces of other people.
Rosenthal added an unnecessary step to his research. Instead of only looking at the difference between pre-test and post-test scores – which tended to support his hypothesis – he also conducted a statistical analysis of just the initial pre-test (baseline) data.
His curiosity would change his career forever.
Rosenthal found statistically significant differences between the test subjects at baseline, before he had even introduced the treatment – differences that ultimately biased the results of his study. That is, he found through analyzing the data that some participants were predisposed to behave as he predicted they would. He recalls, “[A]fter I had unnecessarily, playfully, and compulsively re-analyzed statistically the data of my UCLA doctoral dissertation… that reanalysis suggested strongly that my hypothesis or expectation about how the subjects should respond had somehow been communicated to the subjects so that my hypothesis might have become a self-fulfilling prophecy.” Rosenthal knew he couldn’t be alone, and that this type of experimenter bias may have been ruining data in the social sciences for years.
These were important findings, and his experience one day in 1960 was just as telling. On that particular day, Rosenthal and colleagues were walking to lunch and picked up their mail on the way. Rosenthal received two letters. In one hand, a rejection letter from the journal to which he had submitted his paper on experimenter bias. In the other, a notification that he would receive the American Association for the Advancement of Science’s Socio-Psychological Prize for the very same paper. While publication in a major journal is a highly sought-after achievement for researchers, the contradiction Rosenthal faced highlighted that perhaps not all that is meaningful gets published. And if it isn’t published, what might be lost that could push knowledge and the field forward?
This question, shaped by many more experiences than just those of that day in 1960, informed the work that would become the most cited paper of Rosenthal’s career and among the most influential in the social sciences: “The File Drawer Problem and Tolerance for Null Results”. Because journals typically publish studies with statistically significant findings (results unlikely to have arisen by chance alone), researchers often end up with an entire file drawer full of studies that go unpublished for lack of statistical significance. His 1979 paper called attention to this problem, as well as to the fact that, if these file drawers were opened, combining results across studies could in fact yield statistically significant evidence (see Ambady and Rosenthal 1991). His thinking and leadership in this area spurred increased attention to and use of meta-analysis, a statistical approach now employed throughout the sciences to combine the findings of independently conducted studies.
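To see how pooling a file drawer can turn several individually null results into a clear signal, consider a minimal sketch using Fisher’s method, one classic way to combine independent p-values (this is an illustration, not Rosenthal’s own procedure, and the three study p-values are hypothetical):

```python
from math import exp, log

def fisher_combined_p(p_values):
    """Combine independent p-values via Fisher's method.

    The statistic X = -2 * sum(ln(p_i)) follows a chi-square
    distribution with 2k degrees of freedom under the joint null.
    Since 2k is always even, the chi-square survival function has
    the closed form  P(X > x) = exp(-x/2) * sum_{i<k} (x/2)^i / i!.
    """
    k = len(p_values)
    x = -2.0 * sum(log(p) for p in p_values)
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):          # accumulate (x/2)^i / i! terms
        term *= half / i
        total += term
    return exp(-half) * total

# Three hypothetical "file drawer" studies, each just short of p < .05:
studies = [0.08, 0.08, 0.08]
combined = fisher_combined_p(studies)
print(round(combined, 3))  # ~0.019: jointly significant at the .05 level
```

None of the three studies clears the conventional threshold on its own, yet their combined evidence does, which is precisely why opening the file drawer matters.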
A pioneer in the field of research transparency, Rosenthal continues to advocate for clear research questions, followed by clear research processes. Not only does he want to know how researchers approach their research and data collection, he wants to see increased access to their file drawers.
It is in the spirit of what Rosenthal, and many others, advocate for in the social sciences that the Berkeley Initiative for Transparency in the Social Sciences (BITSS) launched the Leamer-Rosenthal Prizes. These prizes recognize early-career researchers and mid-career faculty who advocate for and implement research transparency methods in their work, strengthening the quality of research design, implementation, and dissemination.
Today we have tools to address the problems identified by Rosenthal and others, such as pre-registering a study and its pre-analysis plan before the research is conducted, and making the data underlying studies public (using repositories such as Dataverse). This presents an enormous opportunity to tackle the big challenges that have beset the quality of social science research: p-hacking (a form of data manipulation that is not always deliberate), publication bias, and, sometimes, outright fraud.
Of course, these opportunities also carry risk. In the short term, scientists who embrace a culture of transparency will leave themselves more open to scrutiny than those who don’t. However, organizations like BITSS are helping steer academics and researchers toward a shift in accepted social science research practices, and toward a culture that embraces transparency for what it can do to advance our understanding of what works, what doesn’t, and why. We hope to do this by building consensus on transparency practices and standards, improving access to and the supply of transparency tools, and introducing incentives, like the Leamer-Rosenthal Prizes, to motivate changes in researcher behavior.
In the words of Dr. Rosenthal, “Science never stops.” We are certainly seeing this in action in the debates and conversations that emerge when leaders in the social science community open up their research. We hope the larger social science research community joins BITSS in acknowledging and celebrating the next wave of pioneers in transparency by submitting nominations for Leamer-Rosenthal Prizes!
Nominations are due September 30, 2015.