By Garret Christensen (BITSS) BITSS just got back from the ASSA conference, the major annual gathering of economists. The conference largely serves to help new PhD economists find jobs, but there are of course sessions of research presentations, a media presence, and sometimes big names like the Chair of the Federal Reserve in attendance. BITSS faculty…
Come Learn More About Research Transparency at ASSA/AEA
If you’re at the ASSA meetings in Boston this weekend, and you are interested in learning more about research transparency, then please stop by booth 127 in the exhibition hall to speak with BITSS and Center for Open Science representatives. Or you can attend our session Monday morning at 10:15am: “Promoting New…
This Monday at AEA2015: Transparency and Integrity in Economic Research Panel
This January 5th at 10:15am at the American Economic Association Annual Meeting in Boston, MA (Sheraton Hotel, Commonwealth Room). Session: Promoting New Norms for Transparency and Integrity in Economic Research. Presiding: Edward Miguel (UC Berkeley). Panelists: Brian Nosek (University of Virginia): “Scientific Utopia: Improving Openness and Reproducibility in Scientific Research”; Richard Ball (Haverford College): “Replicability…
Scientists Have a Sharing Problem
On December 15th, Maggie Puniewska published an article in The Atlantic summarizing the obstacles that prevent researchers from sharing their data. The article asks if “science has traditionally been a field that prizes collaboration […] then why [are] so many scientists stingy with their information.” Puniewska outlines the most cited reasons scientists refrain…
Reflections on Two Years Promoting Transparency in Research
By Guillaume Kroll (CEGA) Two years ago, in December 2012, a handful of researchers convened in Berkeley to discuss emerging strategies to increase openness and transparency in social science research. The group’s concerns followed a number of high-profile cases of scientific misconduct and unethical practices, particularly in psychology (1,2). As researchers started to question the…
Tomorrow! BITSS Research Transparency Forum to Be Livestreamed
Can’t attend our Annual Meeting? Not to worry: our Public Conference (Thu, 1:30 PM – 5:00 PM Pacific Time) will be livestreamed. For those who will be joining us virtually, questions can be submitted for the Q&A panel session starting at 4:30 PM via Twitter using #bitss2014. All other participants who will be able to attend the Public…
Psychology’s Credibility Crisis
In a recent interview appearing in Discover Magazine, Brian Nosek, Co-founder of the Center for Open Science and speaker at the upcoming BITSS Annual Meeting, discusses the credibility crisis in psychology. According to the article, psychology has lost much of its credibility after a series of published papers were revealed as fraudulent and many other…
Tools for Research Transparency: a Preview of Upcoming BITSS Training
By Garret Christensen (BITSS) What are the tools you use to make your research more transparent and reproducible? A lot of my time at BITSS has been spent working on a manual of best practices, and that has required me to familiarize myself with computing tools and resources that make transparent work easier.…
Scientific Irreproducibility and the Prospects of Meta-Research
A recent article from The Economist features John Ioannidis’ Meta-Research Innovation Center (METRICS), whose work to advance the credibility of research will be presented next week at the BITSS Annual Meeting. “Why most published research findings are false” is not, as the title of an academic paper, likely to win friends in the ivory tower. But it has certainly…
Facilitating Radical Change in Publication Standards: Overview of COS Meeting Part II
Originally posted on the Open Science Collaboration by Denny Borsboom This train won’t stop anytime soon. That’s what I kept thinking during the two-day sessions in Charlottesville, where a diverse array of scientific stakeholders worked hard to reach agreement on new journal standards for open and transparent scientific reporting. The aspired standards are intended…
Former BITSS Institute Participant Advocates for Replication in Brazil
Dalson Britto Figueiredo Filho, Adjunct Professor of Political Science at the Federal University of Pernambuco in Recife, Brazil, who attended the BITSS Summer Institute in June 2014, recently published a paper on the importance of replications in Revista Política Hoje. “The BITSS experience really changed my mind on how to do good science”, said Figueiredo Filho.…
Paper Presentations for Annual Meeting Confirmed!
With the 2014 Research Transparency Forum around the corner (Dec. 11-12), we are excited to announce the papers to be presented during Friday’s Research Seminar. After carefully reviewing over 30 competitive submissions, BITSS has selected 6 paper presentations: Neil Malhotra (Stanford University): “Publication Bias in the Social Sciences: Unlocking the File Drawer” Uri…
Creating Standards for Reproducible Research: Overview of COS Meeting
By Garret Christensen (BITSS) Representatives from BITSS (CEGA Faculty Director Ted Miguel, CEGA Executive Director Temina Madon, and BITSS Assistant Project Scientist Garret Christensen–that’s me) spent Monday and Tuesday of this week at a very interesting workshop at the Center for Open Science aimed at creating standards for promoting reproducible research in the social-behavioral…
What to Do If You Are Accused of P-Hacking
In a recent post on Data Colada, University of Pennsylvania Professor Uri Simonsohn discusses what to do in the event that you (a researcher) are accused of having altered your data to increase statistical significance. Simonsohn states: It has become more common to publicly speculate, upon noticing a paper with unusual analyses, that a reported finding was…
First Swedish Graduate Student Training in Transparency in the Social Sciences
Guest Post by Anja Tolonen (University of Gothenburg, Sweden) Seventeen excited graduate students in Economics met at the University of Gothenburg on a Monday in September to initiate an ongoing discussion about transparency practices in Economics. The students came from all over the world: from Kenya, Romania, Hong Kong, Australia, and, of course, Sweden. The initiative…
Scientific consensus has gotten a bad reputation—and it doesn’t deserve it
In a recent post, John Timmer, senior science editor at Ars Technica, defends the importance of consensus, opening with the following quote from author Michael Crichton: Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator…
The 10 Things Every Grad Student Should Do
In a recent post on the Data Pub blog, Carly Strasser provides a useful transparency guide for newcomers to the world of empirical research. Below is an adapted version of that post. 1. Learn to code in some language. Any language. Strasser begins her list by urging students to learn a programming language. As the limitations of…
Teaching Integrity in Empirical Research
In a recent interview on the Library of Congress blog The Signal, Richard Ball (Economics Professor at Haverford College and presenter at the 2014 BITSS Summer Institute) and Norm Medeiros (Associate Librarian at Haverford College) discussed Project TIER (Teaching Integrity in Empirical Research) and their experience teaching students how to…
Reproducible Research: True or False?
Keynote speaker at the upcoming BITSS Annual Meeting John Ioannidis (Professor of Health Research and Policy at Stanford School of Medicine and Co-Director of the Meta-Research Innovation Center) speaks at Google about efforts to improve research design standards and reproducibility in science. Ioannidis is the author of the highly influential 2005 paper Why Most Published Research Findings Are False,…
Can Greater Transparency Lead to Better Social Science?
In a recent article on the Monkey Cage, professors Mike Findley, Nathan Jensen, Edmund Malesky and Tom Pepinsky discuss publication bias, the “file drawer problem” and how a special issue of the journal Comparative Political Studies will help address these problems. Similar to a recent article by Brendan Nyhan, reposted on the BITSS blog, the university professors writing…
BITSS is on Twitter!
BITSS has expanded its online media presence with a new Twitter account. Keep up to date with us and the world of research transparency by following @ucbitss.
Reminder: Call for Papers Deadline is October 10th
Papers or long abstracts for the Call for Papers on Research Transparency must be submitted by Friday, October 10th (11:59pm PST) through CEGA’s Submission Platform. Topics for papers include, but are not limited to: pre-registration and the use of pre-analysis plans; disclosure and transparent reporting; replicability and reproducibility; data sharing; and methods for detecting and reducing…
MCC’s First Open Data Challenge
The U.S. Government’s Millennium Challenge Corporation (MCC) wants to hear your new and innovative ideas on how to maximize the use of data that MCC finances for its independent evaluations. Keynote speakers at this year’s BITSS Research Transparency Forum, Jennifer Sturdy and Jack Molyneaux at MCC’s Department of Policy and Evaluation, and Kathy Farley…
To Get More Out of Science, Show the Rejected Research
In a recent opinion piece on The Upshot, the New York Times news site, Brendan Nyhan, an assistant professor of government at Dartmouth College, comments on a host of transparency-related issues. Closely echoing the mission of BITSS, Nyhan identifies the potential of research transparency to improve the rigor and ultimately the benefits…
Africa’s Data Revolution – Amanda Glassman
Interview originally posted on the Global Poverty Wonkcast: Is the revolution upon us? When it comes to data, the development world seems to be saying yes, Yes, YES! To look beyond the hype, I invited Amanda Glassman, a CGD senior fellow and director of our global health policy program, to join me…
Can Post-Publication Peer-Review Increase Research Transparency?
Guest Post by Liz Allen (ScienceOpen) For the 3rd annual conference of The Berkeley Initiative for Transparency in the Social Sciences (BITSS), ScienceOpen, the new Open Access (OA) research + publishing network, would like prospective and registered attendees to consider the role that Post-Publication Peer Review (PPPR) can play in increasing the transparency…
COS Now Offering Free Consulting Services
A close partner of BITSS, the Center for Open Science (COS) has launched a free consulting service to anyone seeking help with “statistical and methodological questions related to reproducible practices, research design, data analysis, and data management.” The Center is dedicated to increasing the “openness, integrity, and reproducibility of scientific research” and…
Announcing The 2014 Research Transparency Forum
BITSS is pleased to announce its 3rd annual meeting (December 11-12 – Berkeley, CA). This year’s research transparency meeting will be the first to be open to the public and is anticipated to be the largest BITSS event to date. The event will update the academic community on the growing movement for greater…
White House Calls for Comments on Reproducible Research
The White House’s Office of Science and Technology Policy (OSTP) has released a request for information on improving the reproducibility of federally funded scientific research. Given recent evidence of the irreproducibility of a surprising number of published scientific findings, how can the Federal Government leverage its role as a significant funder of scientific research…
Political Scientists Launch New Replication Initiative
Following a groundswell of interest in replication in political science, first noticed in survey results posted on the Monkey Cage blog, political scientists Seth Werfel (Stanford University) and Nicole Janz (Cambridge University) and research consultant Stephanie Wykstra launched the Political Science Replication Initiative, a new repository for uploading study replications. Increasingly, methodological political scientists have recognized…
New Study Sheds Light on File Drawer Problem
A new study recently published in Science provides striking insights into publication bias in the social sciences: Stanford political economist Neil Malhotra and two of his graduate students examined every study since 2002 that was funded by a competitive grants program called TESS (Time-sharing Experiments for the Social Sciences). TESS allows scientists…
eLife, the Center for Open Science, and Science Exchange partner to assess the reproducibility of cancer biology research
eLife will be the publisher for the results of the Reproducibility Project: Cancer Biology, an effort led by the Center for Open Science and Science Exchange. First announced in October 2013, with $1.3 million in funding from the Laura and John Arnold Foundation, The Reproducibility Project: Cancer Biology aims to replicate key…
Call for Pre-analysis Plans of Observational Studies
Observational Studies is a peer-reviewed journal that publishes papers on all aspects of observational studies. Researchers from all fields that make use of observational studies are encouraged to submit papers. Observational Studies encourages submission of study protocols (pre-analysis plans) for observational studies. Before examining the outcomes that will form the basis for…
Job Opportunity in Data Curation/Publication
Innovations for Poverty Action (IPA) seeks a Research Analyst to join the Data Analysis/Data Publication team. This team is leading an innovative and exciting new part of IPA’s effort to promote high quality research: releasing research data from social science experiments publicly, for re-use and replication. The position also involves helping to develop a…
Call for Papers on Research Transparency
BITSS will be holding its 3rd annual conference at UC Berkeley on December 11-12, 2014. The goal of the meeting is to bring together leaders from academia, scholarly publishing, and policy to strengthen the standards of openness and integrity across social science disciplines. This Call for Papers focuses on work that elaborates new tools and strategies to increase the transparency and reproducibility of research. A committee of…
“Research misconduct accounts for a small percentage of total funding”: Study
Data Access and Research Transparency Panel @ APSA 2014
Join the BITSS co-sponsored panel Implementing Data Access and Research Transparency: Multiple Challenges, Multiple Perspectives at the upcoming meeting of the American Political Science Association (August 27-31, 2014 — Washington, DC). Chairs: Colin Elman (Syracuse University) and Arthur Lupia (University of Michigan, Ann Arbor).
Data Science Meets Social Science (Video)
The video from a recent BITSS roundtable entitled “Data Science Meets Social Science” is now available online. Organized in partnership with the UC Berkeley D-Lab, the event brought together leading social scientists and Silicon Valley professionals to discuss pathways of collaboration between the two different fields, and their increasing impact on society in the…
Your Question for the Day — What Is “Peer Review”?
Significance Chasing in Research Practice
A new paper by Jennifer Ware and Marcus Munafò (University of Bristol, UK). Background and Aims: The low reproducibility of findings within the scientific literature is a growing concern. This may be due to many findings being false positives which, in turn, can misdirect research effort and waste money. Methods: We review factors that…
Privacy, Big Data, and the Public Good
Videos and presentations from the book launch of “Privacy, Big Data and the Public Good” (Lane, J., Stodden, V., Bender, S. & Nissenbaum, H. (Eds)) are now available online. Hosted by the NYU Center for Urban Science on July 16, the event included several panels with the book’s editors and a number of the authors. Overview of the book: Massive amounts of new data about people,…
Research Transparency & Open Knowledge: Lessons from #OKFest14
By Guillaume Kroll (CEGA) Over a thousand scientists, activists, and civil society representatives from over 60 countries gathered in Berlin last week for the 2014 Open Knowledge Festival (OKFest14). The Festival is the flagship event of the Open Knowledge Foundation, an international nonprofit promoting open tools, data, and information for the positive transformation of society. It’s a…
Science Establishes New Statistics Review Board
The journal Science is adding an additional step of statistical checks to its peer-review process in an effort to strengthen confidence in published study findings. From the July 4th edition of Science: […] Science has established, effective 1 July 2014, a Statistical Board of Reviewing Editors (SBoRE), consisting of experts in various aspects of statistics and data analysis,…
How to Manipulate Peer Review and Get Your Paper Published
Another peer-review abuse scandal should urge academic journals to reconsider their publication requirements. This one comes from the Journal of Vibration and Control (JVC), a highly technical outlet in the field of acoustics, which just retracted 60 papers at once. The mass retraction followed the revelation of a “peer review ring” in…
Peer Review of Social Science Research in Global Health
A new working paper by Victoria Fan, Rachel Silverman, David Roodman, and William Savedoff at the Center for Global Development. Abstract: In recent years, the interdisciplinary nature of global health has blurred the lines between medicine and social science. As medical journals publish non-experimental research articles on social policies or macro-level interventions, controversies…
Replication in Economics Database
For scientific progress, it is pivotal to review research findings by independently replicating results, thus making the findings more reliable. However, in econometric research, it is not yet common practice to publish replication findings. Replication Wiki: This wiki, developed by researchers at the University of Göttingen (Germany), compiles replications of empirical studies in economics.…
10 Things You Need to Know About…
Check out this new EGAP series: “10 Things You Need to Know About Causal Effects,” “10 Things You Need to Know About Randomization,” “10 Things You Need to Know About Statistical Power,” and “10 Strategies to Figure Out if X Caused Y.”
Research Transparency, Data Access, and Data Citation: A Call to Action for Scholarly Publications
This collaborative statement calls upon the scholarly publishing community to take leadership in advancing knowledge through research transparency, data access, and data citation. Please consider adding your name to the endorsements page and encourage others to do the same. This Call to Action was produced at the “Data Citation and Research Transparency Standards for the Social Sciences”…
Summer Institute Material Now Available
All the material from our summer institute in transparency practices for empirical research is now accessible on our training page. This weeklong workshop provides an overview of the latest trends in the shift towards increased transparency, combining presentations on conceptual issues in current research practices with hands-on training on emerging tools and approaches…
Future Steps for Research in Dishonesty
The Economic Science Association, an experimental economics professional organization, recently organized a workshop about dishonesty research at the University of Copenhagen, Denmark. The panel was composed of Mike Norton (Harvard Business School), Johannes Abeler (Oxford University), Marco Piovesan (University of Copenhagen), and Roberto Weber (University of Zurich). The discussion focused on future directions for research in…