Read it, understand it, believe it, use it

“Principles and proposals for a more credible research publication”, an early draft white paper on best practices for social science journals, by Don Green, Macartan Humphreys, and Jenny Smith. Abstract: In recent years concerns have been raised that second-rate norms for analysis, reporting, and data access limit the gains that should follow from first-rate…

Git/GitHub, Transparency, and Legitimacy in Quantitative Research

Reblogged from The Political Methodologist. A complete research project hosted on GitHub is reproducible and transparent by default in a more comprehensive manner than a typical journal-mandated replication archive […] Maintaining your research project on GitHub confers advantages beyond the social desirability of the practice and the technical benefits of using…

Welcome To The Era of Big Replication

Reblogged from Ed Yong: Psychologists have been sailing through some pretty troubled waters of late. They’ve faced several cases of fraud, high-profile failures to repeat the results of classic experiments, and debates about commonly used methods that are recipes for sexy but misleading results. The critics, many of whom are psychologists themselves,…

New Open Access, Editable Book on Open Science

“Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing”, a new editable book by Sönke Bartling and Sascha Friesike. Modern information and communication technologies, together with a cultural upheaval within the research community, have profoundly changed research in nearly every aspect. Ranging from sharing and discussing ideas…

Open Data Training Course

The Open Knowledge Foundation is organizing an introductory course on open data on Friday, December 6 in London. This one-day workshop is oriented towards organisations considering starting their own open data initiative. Topics to be covered include the benefits of opening data, regulatory requirements, data licensing, data quality and formats, planning an…

The New Statistics: A Pathway to Research Integrity

An eight-step strategy to increase the integrity and credibility of social science research using the new statistics, by Geoff Cumming. We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research…

New Standards for Research Reporting in Psychology

Psychological Science, the flagship journal of the Association for Psychological Science (APS), is introducing innovative new guidelines for authors, part of an effort to strengthen the reporting and analysis of findings in psychological research. Starting January 1, 2014, submitting authors will be required to state that they have disclosed all important methodological details,…

Research Transparency Landscape

A landscape of funder data access policies and other resources, by Stephanie Wykstra. New technology makes sharing research outputs – not just publications but also raw data, code, software, even lab notebooks – easier than ever before. The benefits from more open science are widely acknowledged. Yet there is still room for improvement:…

The Folly of Powering Replications Based on Observed Effect Size

Uri Simonsohn on replications: It is common for researchers running replications to set their sample size assuming the effect size the original researchers got is correct. So if the original study found an effect size of d = .73, the replicator assumes the true effect is d = .73, and sets sample size so as to have…
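Simonsohn's point can be made concrete with a little arithmetic. The sketch below (illustrative values only, using a standard two-sample normal approximation, not code from the post) plans a replication around the observed d = .73 and then shows how little power that plan has if the true effect is, say, half as large:

```python
# Sketch of why powering a replication on the *observed* effect size
# leaves it underpowered whenever the true effect is smaller.
# Two-sample z-approximation; all numbers are illustrative assumptions.
from statistics import NormalDist
from math import sqrt, ceil

z = NormalDist()  # standard normal

def n_per_group(d, power=0.80, alpha=0.05):
    """Per-group sample size for a two-sided two-sample test of effect size d."""
    return ceil(2 * ((z.inv_cdf(power) + z.inv_cdf(1 - alpha / 2)) / d) ** 2)

def achieved_power(d_true, n, alpha=0.05):
    """Power of that test if the true effect is actually d_true."""
    return z.cdf(d_true * sqrt(n / 2) - z.inv_cdf(1 - alpha / 2))

n = n_per_group(0.73)            # planned around the observed d = .73
print(n)                         # roughly 30 per group
print(achieved_power(0.365, n))  # power if the true effect is half as large
```

Because published effect sizes are often inflated by selection, the second number comes out well below the nominal 80%, which is exactly the folly the post describes.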

Too Much Trusting, Not Enough Verifying

This week in The Economist: Too many of the findings that fill the academic ether are the result of shoddy experiments or poor analysis […] One reason is the competitiveness of science […] The obligation to “publish or perish” has come to rule over academic life. Competition for jobs is cut-throat […] Nowadays verification (the…

Trying out the new Trial Registries

Reblogged from World Bank’s David McKenzie: Both the American Economic Association and 3ie have launched Impact Evaluation Trial Registries […] I recently tried out both registries by registering a couple of studies I have underway, so thought I’d share some feedback on the process for those of you wondering whether/how to register. Read…

Changes in the Research Process Must Come From the Scientific Community

In a recent article forthcoming in a major policy journal, Victoria Stodden urges the scientific community to take the lead in establishing a new framework for more transparent research practices. While recent policy changes by the US government regarding public access to data and publications from federally funded research can…

Let’s Go Fishing

An interesting piece on p-fishing (running analyses until a statistically significant result turns up) and what we can do about it.

The Imperative to Share Complete Replication Files

“Good research involves publishing complete replication files, making every step of research as explicit and reproducible as is practical.” This is the conclusion from a new paper by political scientist Allan Dafoe (Yale University). Dafoe examines the availability of replication data in political science journals, and concludes that “for the majority of published statistical analyses, […]…

New Registry for Impact Evaluations in International Development

The 3ie Registry for International Development Impact Evaluations (RIDIE) is a registry of impact evaluations related to development in low and middle income countries. The purpose of the registry is to enhance the transparency and quality of evaluation research as well as to provide a repository of impact evaluation studies for researchers, funders, and…

Bias Minimization Lessons from Medicine – How We Are Leaving a $100 Bill on the Ground

By Alex Eble (Brown University), Peter Boone (Effective Intervention), and Diana Elbourne (University of London) The randomized controlled trial (RCT) now has pride of place in much applied work in economics and other social sciences. Economists increasingly use the RCT as a primary method of investigation, and aid agencies such as the World…

AEA RCT Registry Webinar This Friday

The American Economic Association’s RCT Registry is a registration tool for pre-analysis plans of Randomized Controlled Trials in economics and other social sciences. The Abdul Latif Jameel Poverty Action Lab (J-PAL) will be hosting a brown bag webcast this Friday, September 20th at 1pm (EDT) to go over the motivations behind the registry and…

The Role of Failure in Promoting Transparency

By Carson Christiano (CEGA) You may wonder why a network of development researchers is taking the lead on a transparency initiative. The answer lies in the profound and omnipresent power of failure. Most would agree that risk-taking is essential to innovation, whether we’re talking about creating a simple hand-washing station or a…

Research Transparency in the Natural Sciences: What can we learn?

By Temina Madon (CEGA, UC Berkeley) As we all know, experimentation in the natural sciences far predates the use of randomized, controlled trials (RCTs) in medicine and the social sciences; some of the earliest controlled experiments were conducted in the 1920s by RA Fisher, an agricultural scientist evaluating new crop varieties across…

Transparency-Inducing Institutions and Legitimacy

By Kevin M. Esterling (Political Science, UC Riverside) Whenever I discuss the idea of hypothesis preregistration with colleagues in political science and in psychology, the reactions I get typically range from resistance to outright hostility. These colleagues obviously understand the limitations of research founded on false-positives and data over-fitting. They are even…

The Need for Pre-Analysis: First Things First

By Richard Sedlmayr (Philanthropic Advisor) When we picture a desperate student running endless tests on his dataset until some feeble point finally meets statistical reporting conventions, we are quick to dismiss the results. But the underlying issue is ubiquitous: it is hard to analyze data without getting caught in a hypothesis drift,…

Freedom! Pre-Analysis Plans and Complex Analysis

By Gabriel Lenz (UC Berkeley) Like many researchers, I worry constantly about whether findings are true or merely the result of a process variously called data mining, fishing, capitalizing on chance, or p-hacking. Since academics face extraordinary incentives to produce novel results, many suspect that “torturing the data until it speaks” is…

Transparency and Pre-Analysis Plans: Lessons from Public Health

By David Laitin (Political Science, Stanford) My claim in this blog entry is that political science will remain principally an observation-based discipline and that our core principles of establishing findings as significant should consequently be based upon best practices in observational research. This is not to deny that there is an expanding…

Targeted Learning from Data: Valid Statistical Inference Using Data Adaptive Methods

By Maya Petersen, Alan Hubbard, and Mark van der Laan (Public Health, UC Berkeley) Statistics provide a powerful tool for learning about the world, in part because they allow us to quantify uncertainty and control how often we falsely reject null hypotheses. Pre-specified study designs, including analysis plans, ensure that we understand…
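One simple idea in this family (a basic cousin of the cross-validated methods the authors develop, sketched here with simulated data rather than anything from the post) is sample splitting: let the data-adaptive step run freely on one half of the sample, then test the chosen hypothesis on the untouched half, where standard inference remains valid:

```python
# Sketch of sample splitting for valid inference after adaptive selection.
# Simulated data; the split protects the confirmatory test from the
# data-driven choice made on the exploration half.
import random
from math import sqrt
from statistics import NormalDist, mean, stdev

random.seed(1)
# 20 candidate outcomes, all truly null (mean zero), 200 observations each.
data = [[random.gauss(0, 1) for _ in range(200)] for _ in range(20)]

# Adaptive step, on the exploration half only: pick the "best-looking" outcome.
explore = [outcome[:100] for outcome in data]
best = max(range(20), key=lambda j: abs(mean(explore[j])))

# Confirmatory test on the held-out half of that same outcome.
holdout = data[best][100:]
t = mean(holdout) / (stdev(holdout) / sqrt(len(holdout)))
p = 2 * (1 - NormalDist().cdf(abs(t)))  # normal approximation to the t test
print(p)  # valid p-value despite the data-driven selection
```

Running the selection and the test on the same observations would instead make the smallest of twenty p-values look like a single pre-specified test, which is precisely the false-rejection problem the post addresses.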

Monkey Business

By Macartan Humphreys (Political Science, Columbia & EGAP) I am sold on the idea of research registration. Two things convinced me. First I have been teaching courses in which each week we try to replicate prominent results produced by political scientists and economists working on the political economy of development. I advise…

Bayes’ Rule and the Paradox of Pre-Registration of RCTs

By Donald P. Green (Political Science, Columbia) Not long ago, I attended a talk at which the presenter described the results of a large, well-crafted experiment. His results indicated that the average treatment effect was close to zero, with a small standard error. Later in the talk, however, the speaker revealed that…

An Open Discussion on Promoting Transparency in Social Science Research

By Edward Miguel (Economics, UC Berkeley) This CEGA Blog Forum builds on a seminal research meeting held at the University of California, Berkeley on December 7, 2012. The goal was to bring together a select interdisciplinary group of scholars – from biostatistics, economics, political science and psychology – with a shared interest…