Towards Transparency in the Social Sciences: an Eye-opening Experience at Berkeley

As a Fulbright PhD student in development economics from Brussels, my experience this past year on the Berkeley campus has been eye-opening. In particular, I discovered a new movement toward improving standards of openness and integrity in economics, political science, psychology, and related disciplines, led by the Berkeley Initiative for Transparency in the Social Sciences (BITSS). When I first discovered BITSS, it struck me how little I knew about research on research in the social sciences, the pervasiveness of fraud and questionable practices in science in general (from selective data cleaning and specification searching to outright data fabrication), and the basic lack of consensus on the right and wrong ways to do research. These issues are essential, yet too often they are left by the wayside. Transparency, reproducibility, replicability, and integrity are the building blocks of scientific research.

Eager to dig deeper, I took Prof. Edward Miguel's course on transparency in research (available on YouTube), the first of its kind in the social sciences. The course is mostly devoted to development topics, but it brought together economists, epidemiologists, political scientists, psychologists, and others. When I joined the class, I was finishing my thesis, and I envied the students starting their PhDs with such a course under their belts. If they can implement what they learned, they will "get to the right answer, not get an answer". This is the fundamental goal of research, and one that has gone astray in many ways in recent years.

I once heard that it is harder and more time-consuming to write a paper with bad data, endlessly patching things up, than to collect the data properly the first time. It may seem obvious, but many researchers rush their survey design, collect the data, and only then realize that they asked many irrelevant questions or that their study lacks statistical power. Most data flaws could have been avoided by spending more time on the design. Not every department has millions of dollars to collect data or run experiments, but as long as budgeting is a common concern, it should be properly accounted for, or costs can be pooled with other research centers.

Being transparent starts with elaborating a research plan. From there, it does not require much effort to write it up and register it as a pre-analysis plan (see this BITSS post or the World Bank checklist). By similar logic, a professor once told me that I could bring my own handwritten notes to an exam, and could literally copy the book, because writing something down forces you to stop and think clearly about what you are writing. Most of us cannot write down something we do not understand or that does not make sense. I think this also applies to research: you will not register a plan that is doomed to fail.

For a lot of questions, you do not necessarily need to collect your own data. We all tend to forget that there are many datasets in the public domain, and authors who are willing to share theirs. More and more of these datasets are geo-referenced and can easily be merged with other publicly available data (a minimal sketch of such a merge follows below). With existing data, it is harder to build credibility from a pre-analysis plan, but nothing prevents you from writing one for yourself and your co-authors, and sticking to it.

It is our job to share methodologically sound practices that help to answer important questions. It is our job to be open to discussion, debate, and disagreement. You do not want to get ahead by cheating.
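For readers who have not tried such a merge, here is a minimal sketch in Python with pandas. The file and column names are hypothetical placeholders, not any particular dataset; the point is that a few scripted, documented steps (rather than manual edits) keep the merge reproducible.

```python
# Minimal sketch: merging two public, geo-referenced datasets.
# File and column names are hypothetical placeholders; real data
# will need its own cleaning and documentation.
import pandas as pd

# Two publicly available datasets that both carry coordinates.
surveys = pd.read_csv("household_survey.csv")   # e.g., survey points
rainfall = pd.read_csv("rainfall_grid.csv")     # e.g., gridded climate data

# Round coordinates to a common grid resolution (here 0.1 degrees)
# so that point data can be matched to gridded data.
for df in (surveys, rainfall):
    df["lat_cell"] = df["latitude"].round(1)
    df["lon_cell"] = df["longitude"].round(1)

# Merge on the shared grid cell; keep every survey observation and
# flag unmatched rows instead of silently dropping them.
merged = surveys.merge(
    rainfall[["lat_cell", "lon_cell", "rainfall_mm"]],
    on=["lat_cell", "lon_cell"],
    how="left",
    indicator=True,
)
print(merged["_merge"].value_counts())  # check the match rate before analysis
```

Scripting the merge this way also makes it easy to register, share, and rerun, which is exactly the kind of transparency the movement is after.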
The satisfaction you get will be much higher if you have done things right (which by now should be an objective standard). Don't let practicality get in the way of rigor. If you are a social scientist and would like to embrace this movement, here is a longer version of this post, which covers some of the tools at our disposal for making research more transparent and reproducible.

Olivia D'Aoust, 2014-2015 Fulbright Grantee
