Welcome to the BITSS Blog

Announcing the winners of the 2024 Catalyst Grants for Advancing Transparent, Reproducible, and Ethical Research!

The Berkeley Initiative for Transparency in the Social Sciences (BITSS) is excited to announce the recipients of our 2024 round of competitively selected Catalyst grants. From January to June 2024, 8 Catalysts will carry out projects across 8 countries, ranging from holding research transparency trainings to translating materials into different languages with added cultural…

BITSS Honored for Building the Next Generation of Open Science Advocates

Credit: Einstein Stiftung Berlin/Sebastian Semmer. “We know that institutions matter: They transform the [scientific] dedication of individuals to the next generation,” remarked representatives of the Einstein Foundation Berlin as they awarded the Einstein Foundation Award for Promoting Quality in Research to BITSS on March 14. The award recognizes individual researchers, institutions, and…

The Gender Gap in Academic Criticism

Research shows that women are less likely to point out and penalize mistakes in science, and publish fewer comments and failed replications in scientific journals. What does this mean for the social sciences? BITSS Program Manager Grace Han interviews David Klinowski (Visiting Assistant Professor at the University of Pittsburgh Katz Graduate School…

BITSS Leadership Joins Call for Einstein Foundation Award Winners

Edward Miguel, BITSS Faculty Director and Professor of Economics at UC Berkeley, and Carson Christiano, Executive Director of the Center for Effective Global Action (CEGA), participated in a Zoom call last month for winners of the 2023 Einstein Foundation Award for Promoting Quality in Research. During the call, Miguel and Christiano…

BITSS wins Einstein Foundation Institutional Award for Promoting Quality in Research

BITSS is thrilled to share that it has received the 2023 Einstein Foundation Award for Promoting Quality in Research in the Institutional category. The Einstein Foundation Berlin made the announcement in a statement on November 14, 2023. This annual award recognizes individual researchers, institutions, and early career researchers whose work helps to…

BITSS Catalysts Inspired to Publish on Research Ethics

The authors, Anna Josephson and Jeffrey Michler, credit the BITSS textbook Transparent and Reproducible Social Science Research with launching their research ethics journey. (Note: this post is co-published with CEGA.) Acknowledging the surplus of data collected and analyzed by economists and social science researchers and the speed at which results are disseminated, Anna Josephson and…

BITSS Flagship Open Science Training Gets a Boost

Introduction from BITSS: Today on the BITSS blog, BITSS Communications Intern Brian Lee announces a new NIA grant that invests in open science and transparency efforts across the social sciences. One of the biggest barriers to the widespread adoption of research transparency tools and practices—and ultimately the credibility of social science research—is…

Pre-Specification and Reproducibility Outside of Academia

Introduction from BITSS: In this post, Catalyst Ann Furbush shares her experience from her training project and discusses how open science tools and practices can be adopted in professional settings outside of academic research. Enjoy the read! Open science practices are not universally adopted in the social sciences and are often not…

Ensuring Reproducibility in Large Research Teams

Introduction from BITSS: Today on the BITSS blog, Thomas Brailey shares takeaways from his Catalyst training project, which involved onboarding members of the J-PAL Payments and Governance Research Program in reproducible workflows. Check out the training materials developed as part of the project and read on to learn more! Holding all else…

Promoting Transparency and Equity in Pre-Doctoral Research

by Coly Elhai, Dominic Russel, and Jun Wong. Introduction from BITSS: Today on the BITSS blog, Coly Elhai, Dominic Russel, and Jun Wong reflect on their Catalyst training project entitled “Transparency and Equity in Pre-Doctoral Research,” which featured a large online workshop on research transparency for pre-doctoral economics students. Full-time pre-doctoral research experiences…

Neglecting Null Results: What We Don’t Know Could Hurt Us

By Aleksandar Bogdanoski. This post is also published on the CEGA blog. We share insights from a recent article published in the Proceedings of the National Academy of Sciences (PNAS), co-authored by CEGA/BITSS Faculty Co-Director Ted Miguel and several members of the BITSS community. The article discusses how researchers, funders, journal editors, and others can…

Introducing the Social Science Reproduction Platform, a resource for teaching and improving computational reproducibility

By Aleksandar Bogdanoski (Project Manager, BITSS), Fernando Hoces de la Guardia (Project Scientist, BITSS), Edward Miguel (Faculty Director, CEGA/BITSS), and Lars Vilhuber (Executive Director, Labor Dynamics Institute, Cornell University, and Data Editor, American Economic Association). This post is also published on the CEGA blog. Do you teach, or participate in, an empirical…

Using Open Policy Analysis to Fight Alternative Facts

By Fernando Hoces de la Guardia. This post was originally published on the CEGA Blog on May 25, 2021. The emergence of “alternative facts” and post-truth politics is usually associated with the rise of populism in western democracies. While there is an element of truth to this association, democratic governments have struggled…

Métodos em Pauta: an initiative bringing transparency to Brazilian political science

Introduction from BITSS: In addition to being synonymous with research transparency and reproducibility, open science is also about building communities centered around collaboration and the exchange of knowledge. In this post, Catalysts Amanda Domingos and Rodrigo Lins share their experience establishing and leading Métodos em Pauta [Methods on the Agenda], a student-led…

Emerging benefits and insights from a year of forecasting on the Social Science Prediction Platform

By Katie Hoeberling, Senior Program Manager. The Social Science Prediction Platform (SSPP) allows researchers to systematically collect predictions about the results of research, as discussed in this post from last July. When faced with important questions and scarce resources, policymakers and practitioners rely on research and expert perspectives to make decisions (often…

What We Can Learn through Replication in Qualitative Research

An interview with Megan Becker (University of Southern California), by Aleksandar Bogdanoski (BITSS). Replication underscores the importance of transparency in methods and data in research and is critical in ensuring that science is self-correcting. Though efforts for improved replicability led by many in the open science community, including BITSS, have focused mainly…

An Open Policy Analysis for Deworming Interventions

An illustration of how the policy community can use OPA to strengthen the evidence-to-policy link. By Fernando Hoces de la Guardia. What is the deworming OPA? Opacity in policy analysis presents barriers to applying rigorously generated evidence to novel settings. Open Policy Analysis (OPA) is a framework to improve the transparency and…

Peer Community In Meta-Research: Community peer review

By Chris Hartgerink (Liberate Science GmbH). Introduction from BITSS: BITSS launched MetaArXiv (formerly BITSS Preprints) in 2017 as a preprint service where researchers could share and discover work focused on improving research transparency and reproducibility and other meta-scientific research. We recently teamed up with Peer Community In (PCI) Meta-Research to give MetaArXiv…

Internal replication: another tool for the reproducibility toolkit

By Jade Benjamin-Chung (University of California, Berkeley) and Benjamin F. Arnold (University of California, San Francisco). Introduction from BITSS: Internal replication is a new tool in the reproducibility toolkit with which original study investigators replicate findings prior to submission to a peer-reviewed journal. Jade Benjamin-Chung (UC Berkeley) and Benjamin Arnold (UCSF) describe…

Announcing the launch of the Social Science Prediction Platform!

By Aleksandar Bogdanoski (BITSS Senior Program Associate) and Katie Hoeberling (BITSS Program Manager). This post is also published on the Development Impact blog and the CEGA blog. What will be the effect of raising the minimum wage by $2 an hour? How will cash transfers impact local prices in rural Kenya? Will…

Why COVID-19 makes research transparency more important than ever

By Aleksandar Bogdanoski, Katie Hoeberling, and Fernando Hoces de la Guardia. This post is also published on the CEGA blog. The COVID-19 pandemic has arguably had a more profound impact on scientific research than any other phenomenon in recent history; between January and May this year, over 23,000 research papers were written…

Open Scholarship Week 2020: Embracing openness in Ireland and beyond

by Hardy Schwamm (NUI Galway) and Elaine Toomey (University of Limerick/Cochrane Ireland). Introduction from BITSS: In what is becoming an annual event, the Open Science Community Galway at the National University of Ireland Galway (NUI Galway) hosted Open Scholarship Week 2020, featuring a series of workshops, seminars, and presentations dedicated to transparent,…

The 2019 BITSS Annual Meeting: A barometer for the evolving open science movement

By Aleksandar Bogdanoski and Katie Hoeberling. Each year we look forward to our Annual Meeting as a space for showcasing new meta-research and discussing progress in the movement for research transparency. These meetings offer snapshots of the evolving priorities and challenges faced by the scientific community working to change research norms. BITSS’s…

New resources for de-identifying and publishing research data from J-PAL

Introduction from BITSS: Open data has many benefits. It can foster collaboration, facilitate more complete meta-analysis, and improve the visibility of related research outputs. At the same time, we know that re-identification can cause real risks to study participants and that balancing openness with such risks is a delicate and often difficult…
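
The J-PAL resources announced in the full post cover these trade-offs in depth; as a purely illustrative sketch (not drawn from those materials), the Python snippet below shows two common de-identification steps before publishing survey data: dropping direct identifiers and coarsening quasi-identifiers. All column names and the age bins are hypothetical examples.

```python
# Illustrative only: two common de-identification steps -- dropping direct
# identifiers and coarsening quasi-identifiers -- before publishing survey data.
# Column names and the age-binning rule are hypothetical, not recommendations
# taken from the J-PAL resources described above.
import pandas as pd

def deidentify(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # 1. Drop direct identifiers outright.
    direct_identifiers = ["name", "phone_number", "gps_latitude", "gps_longitude"]
    out = out.drop(columns=[c for c in direct_identifiers if c in out.columns])
    # 2. Coarsen quasi-identifiers that could enable re-identification in combination.
    if "age" in out.columns:
        out["age_group"] = pd.cut(
            out["age"],
            bins=[0, 18, 30, 45, 60, 120],
            labels=["<18", "18-29", "30-44", "45-59", "60+"],
        )
        out = out.drop(columns=["age"])
    return out

if __name__ == "__main__":
    sample = pd.DataFrame({
        "name": ["A. Respondent"],
        "phone_number": ["555-0100"],
        "age": [34],
        "village_id": [17],
        "outcome": [1.2],
    })
    print(deidentify(sample))
```

In practice, the harder judgment calls usually concern quasi-identifiers (combinations such as location, occupation, and age) rather than obvious direct identifiers, which is exactly the balancing act the post describes.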

What development economists talk about when they talk about reproducibility…

Development economists, by and large, are driven to inform effective policy by generating rigorous and credible evidence. So in the face of failures to replicate findings, as well as evidence of pervasive publication bias across the social sciences, it’s no surprise that work to increase research transparency and reproducibility is underway in…

Transparent and Reproducible Social Science Research: A new open science textbook

It’s been nearly 15 years since John Ioannidis’s “Why Most Published Research Findings Are False” was published, turning the scientific community on its head. Today you might think little has changed. Just type “science is broken” into your online search engine, and you’ll find dozens of recent articles, blog posts, videos, and…

Pre-results Review at the Journal of Development Economics: Lessons learned so far

By Andrew Foster (Brown University), Dean Karlan (Northwestern University), Edward Miguel (UC Berkeley), and Aleksandar Bogdanoski (BITSS). This post was originally published on the Development Impact blog. BITSS has been working with the Journal of Development Economics (JDE) to introduce Pre-results Review (also referred to as “registered reports” in other disciplines) as…

Pre-results review reaches the (economic) lab: Experimental Economics follows the Journal of Development Economics in piloting pre-results review

In its April 2019 issue, the journal Experimental Economics issued a Call for Submissions for a virtual Symposium of 5–7 papers to be published under “pre-results review”. BITSS Senior Program Associate Aleksandar Bogdanoski talked to Irenaeus Wolff of the University of Konstanz, who, along with Urs Fischbacher, is a guest editor for the…

Better pre-analysis plans through design declaration and diagnosis

DeclareDesign is a set of software packages for planning and assessing research designs, written by Graeme Blair, Jasper Cooper, Alexander Coppock, and Macartan Humphreys. The software is based on the Model, Inquiry, Data Strategy, Answer Strategy (MIDA) framework for declaring and diagnosing research designs, introduced in their paper forthcoming in the American Political Science Review. In…
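
To give a rough sense of the declare-then-diagnose workflow, here is a minimal sketch in Python. It is not DeclareDesign (an R package) and does not use its API; the function names and the two-arm experiment are illustrative assumptions only, showing how a design declared as Model, Inquiry, Data Strategy, and Answer Strategy can be diagnosed by simulation for bias and power.

```python
# Hypothetical sketch of MIDA-style design declaration and diagnosis.
# Not DeclareDesign's API; names and the two-arm design are illustrative only.
import numpy as np

rng = np.random.default_rng(42)

def model(n=100, effect=0.3):
    """Model (M): potential outcomes for n units with a constant treatment effect."""
    y0 = rng.normal(0.0, 1.0, n)
    y1 = y0 + effect
    return y0, y1

def inquiry(y0, y1):
    """Inquiry (I): the estimand, here the average treatment effect."""
    return float(np.mean(y1 - y0))

def data_strategy(y0, y1):
    """Data strategy (D): randomly assign half the units to treatment, observe one outcome each."""
    n = len(y0)
    z = rng.permutation(np.repeat([0, 1], n // 2))
    y = np.where(z == 1, y1, y0)
    return y, z

def answer_strategy(y, z):
    """Answer strategy (A): difference in means with a simple large-sample z-test."""
    treated, control = y[z == 1], y[z == 0]
    estimate = treated.mean() - control.mean()
    se = np.sqrt(treated.var(ddof=1) / len(treated) + control.var(ddof=1) / len(control))
    return estimate, abs(estimate / se) > 1.96

def diagnose(sims=2000):
    """Diagnosis: simulate the full design many times to estimate bias and power."""
    estimates, estimands, rejections = [], [], []
    for _ in range(sims):
        y0, y1 = model()
        estimands.append(inquiry(y0, y1))
        y, z = data_strategy(y0, y1)
        estimate, rejected = answer_strategy(y, z)
        estimates.append(estimate)
        rejections.append(rejected)
    bias = float(np.mean(np.array(estimates) - np.array(estimands)))
    power = float(np.mean(rejections))
    return bias, power

if __name__ == "__main__":
    bias, power = diagnose()
    print(f"bias ~ {bias:.3f}, power ~ {power:.2f}")
```

Running the sketch repeatedly simulates the declared design and reports how often the answer strategy recovers the inquiry, the kind of diagnosand-based evaluation the MIDA framework formalizes.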

Opening up the analysis behind Elizabeth Warren’s wealth tax plan

Press release: BERKELEY, CA (Wednesday, March 13, 2019) — BITSS collaborated with UC Berkeley economists Emmanuel Saez and Gabriel Zucman to produce a fully reproducible version of their policy report for Sen. Elizabeth Warren’s wealth tax plan.

Three reasons in favor of transparent, reproducible, and ethical research practices

By Fernando Hoces de la Guardia (BITSS) and Sebastián Martínez (Inter-American Development Bank). This post is cross-posted on the IDB Impact blog and the CEGA blog. You can read it in Spanish here. Introduction from BITSS: This post highlights the results of a successful partnership between BITSS and the Inter-American Development Bank…

The Persistence of False Paradigms in Low-Power Sciences

By Pascal Michaillat (Brown University). It is commonly believed that the lack of experimental evidence typical in the social sciences slows but does not prevent the replacement of existing theories by newer, better ones. A simple model of scientific research and promotion challenges that belief, however. In the model, scientists are slightly…

The Future of Forecasting – Highlights from the BITSS Workshop on Forecasting Social Science Research Results

By Nicholas Otis, second-year PhD student in Health Economics at UC Berkeley. This post is also published on the CEGA blog. Researchers are increasingly collecting forecasts of social science research results. For example, researchers have recently integrated predictions into studies examining questions such as the long-term effects of a community-driven development…

Working toward a Common Rule for Transparent, Reproducible, and Ethical Research

by Jennifer Sturdy (BITSS). Communalism, organized skepticism, disinterestedness, and universalism[1]. These scientific norms guide BITSS as an advocate for research transparency. From our perspective, research transparency is the means by which the research ecosystem operationalizes these scientific norms, and fully transparent research demonstrates when these norms are adhered to and when they…

ReproducibiliTea: A Reproducibility-themed Journal Club

ReproducibiliTea started as a journal club at the University of Oxford and is now also a podcast co-hosted by Sam Parsons, Amy Orben, and Sophia Crüwell. As the name suggests, the leaders are focused on research reproducibility and subtopics including transparency and rigor. BITSS Catalyst Amy Riegelman interviewed Sam Parsons, who responded…

Power to the Plan

By Clare Leaver, Owen Ozier, Pieter Serneels, and Andrew Zeitlin. This post is also published on the Development Impact blog. The holidays are upon us. You might like to show off a bit by preparing something special for the ones you love. Why not make a pre-analysis plan this holiday season? You’re…

Pre-results review at the Journal of Development Economics: Taking transparency in the discipline to the next level

This post, developed by Aleksandar Bogdanoski (Program Associate, BITSS) and Keesler Welch (Research Associate, J-PAL) with support from Anja Sautmann (Director of Research, Education, and Training, J-PAL), is also published on the CEGA blog and the J-PAL blog. Earlier this year, the Journal of Development Economics (JDE) began the pilot of a “pre-results…

A Great Day for Open Policy Analysis

Post by Fernando Hoces de la Guardia (BITSS Postdoc). How the best tweet of the year came not from Taylor Swift or Barack Obama… but from the CBO. Earlier this year, we wrote a blog post outlining our vision for bringing “Open Science” into policy analysis. The basic idea is simple: government policy analysts…

CONSORT-SPI 2018: Announcing an extension for randomized controlled trials of social and psychological interventions

Re-post by Paul Montgomery, Evan Mayo-Wilson, and Sean Grant. Complete and transparent reporting of randomized controlled trials is integral to replication, critical appraisal, and understanding context. Published today in Trials, a new extension of the CONSORT Statement aims to improve the reporting of randomized controlled trials of social and psychological interventions. Here,…

MetaLab Awards Three Contribution Challenge Prizes!

Guest announcement by Christina Bergmann and Sho Tsuji (MetaLab). The MetaLab challenge calling for meta-analyses on cognitive development, with support from the Berkeley Initiative for Transparency in the Social Sciences (BITSS), has closed. We received data for 7 meta-analyses, which will be added to MetaLab in the coming months. The winners are three…

Research transparency in Sub-Saharan Africa: Lessons learned and ways forward

Guest post by Soazic Elise Wang Sonne (World Bank). Despite efforts by African governments to significantly raise public spending on scientific research, the continent, which is home to 14% of the world’s population, contributes less than 1% of published research outputs (David Dunne, 2017). While this can be partly attributed to…

Interpretation of study results (Part 2/2): A reproducible method

Guest post by Arnaud Vaganay (Meta-Lab). This post is the second of two dedicated to the reproducible interpretation of empirical results in the social sciences. Read part 1 here. In my previous post on the interpretation of study results, I contrasted the notions of: Analytic reproducibility, which is concerned with the reproducibility…

Transparency and Trust in the Research Ecosystem

CEGA launched the Berkeley Initiative for Transparency in the Social Sciences (BITSS) based on the argument that more transparency in research could address underlying factors driving publication bias, unreliable research findings, a lack of reproducibility in the published literature, and a problematic incentive structure within the research ecosystem. Meanwhile, an ever-increasing number…