Who Inspired the Leamer-Rosenthal Prizes? Part II – Ed Leamer

Guest post by Edward Leamer, UCLA Professor of Economics & Statistics


I became interested in methodological issues as a University of Michigan graduate student from 1967 to 1970, watching the economics faculty build an econometric macro model in the basement of the building (The Michigan Model), and comparing that with how these same faculty members described what they were doing when they taught econometric theory on the top floor of the building. Though the faculty in the basement and on the top floor were, to outward appearances, the very same people, ascending or descending the stairs seemed to alter their inner intellectual selves completely.

The words “specification search” in my 1978 book Specification Searches refer to the search for a model to summarize the data in the basement where the dirty work is done, while the theory of pristine inference taught on the top floor presumes the existence of the model before the data are observed. This assumption of a known model may work in an experimental setting in which there are both experimental controls and randomized treatments, but for the non-experimental data that economists routinely study, much of the effort is an exploratory search for a model, not estimation with a known and given model. The very wide model search that usually occurs renders econometric theory suspect at best, and possibly irrelevant. Things like unbiased estimators, standard errors and t-statistics lose their meaning well before you get to your 100th trial model.
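
To see concretely why the t-statistics lose their meaning, here is a minimal simulation, my own illustration rather than anything from the book: the outcome is pure noise, yet an analyst who tries twenty candidate regressors and reports only the best-looking one will find a “significant” result most of the time.

```python
# Minimal sketch: a specification search over pure-noise data.
# All names and numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_candidates, n_sims, t_crit = 100, 20, 1000, 1.96
false_positives = 0

for _ in range(n_sims):
    y = rng.standard_normal(n_obs)                   # outcome: pure noise
    X = rng.standard_normal((n_obs, n_candidates))   # trial regressors, all irrelevant
    best_t = 0.0
    for j in range(n_candidates):                    # the "search": fit each trial model y ~ x_j
        x = X[:, j]
        beta = x @ y / (x @ x)                       # OLS slope, no intercept
        resid = y - beta * x
        se = np.sqrt(resid @ resid / (n_obs - 1) / (x @ x))
        best_t = max(best_t, abs(beta / se))
    if best_t > t_crit:                              # report only the best specification found
        false_positives += 1

print(f"share of searches with a 'significant' best t: {false_positives / n_sims:.2f}")
# With 20 independent trial regressors the nominal 5% error rate
# climbs toward 1 - 0.95**20, roughly 0.64.
```

The reported standard errors describe the first regression an analyst might have run, not the one that survives the search.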

Looking at what was going on, it seemed to me essential to make theory and practice more compatible, by changing both practice and theory. An essential but fortuitous accident in my intellectual life had me taking courses in Bayesian statistics in the Math Department. The Bayesian philosophy seemed to offer a logic that would explain the specification searches that were occurring in the basement and that were invalidating the econometric theory taught on the top floor, and also a way of bringing the two floors closer together.

The fundamental message of the Bayesian approach is that, when the data are weak, the context matters, or more accurately the analyst’s views about the context matter.  The same data set can allow some to conclude legitimately that executions deter murder and also allow others to conclude that there is no deterrent effect, because they see the context differently.  While it’s not the only kind of specification search, per my book, an “interpretative search” combines the data information with the analyst’s ambiguous and ill-defined understanding of the context.  The Bayesian philosophy offers a perfect hypothetical solution to the problem of pooling the data information with the prior contextual information – one summarizes the contextual information in the form of a previous hypothetical data set.
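
The “previous hypothetical data set” idea can be made concrete with a toy calculation; the numbers below are my own hypothetical example, not anything from the book. For a normal mean, a conjugate prior enters the posterior exactly as if it were a handful of earlier observations centered at the prior mean, so analysts with different readings of the context can legitimately reach different conclusions from the same weak data.

```python
# Minimal sketch: the prior as a hypothetical previous data set
# (normal mean, known variance; all numbers are illustrative).
import numpy as np

def posterior_mean(data, prior_mean, prior_obs):
    """Pool the data with a prior worth `prior_obs` hypothetical observations at `prior_mean`."""
    n = len(data)
    return (prior_obs * prior_mean + n * np.mean(data)) / (prior_obs + n)

data = np.array([2.1, 1.4, 3.0, 2.6])    # a small, weak sample (mean about 2.3)

# A skeptic whose reading of the context is "the effect is near zero"
print(posterior_mean(data, prior_mean=0.0, prior_obs=20))   # pulled strongly toward 0
# An analyst whose contextual information is close to vacuous
print(posterior_mean(data, prior_mean=0.0, prior_obs=0.1))  # close to the sample mean
```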

A HUGE hypothetical benefit of a Bayesian approach is real transparency both to oneself and to the audience of readers.  Some people think that transparency can be achieved by requiring researchers to record and to reveal all the model exploration steps they take, but if we don’t have any way to adjust or to discount conclusions from these specification searches, this is transparency without accountability, without consequence.   What is really appealing about the Bayesian approach is that the prior information of the analyst is explicitly introduced into the data analysis and “straightforwardly” revealed both to the analyst and to her audience.   This is transparency with consequence.  We can see why some think executions deter murders and others see no deterrent effect.

The frustratingly naïve view that often meets this proposal is that “science doesn’t make up data.”   When I hear that comment, I just walk away.  It isn’t worth the energy to try to discuss how inferences from observational data are actually made, and for that matter how experiments are interpreted as well.   We all make up the equivalent of previous data sets, in the sense of allowing the context to matter in interpreting the evidence.   It’s a matter of how, not if.  Actually, I like to suggest that the two worst people to study data sets are a statistician who doesn’t understand the context, and a practitioner who doesn’t understand the statistical subtleties.

However, we remain far from a practical solution, Bayesian or otherwise, and current practice is more or less the same as it was when punch cards were fed into computers back in the 1960s.  The difference is that with each advance in technology from counting on fingers to Monroe calculators to paper tapes to punch cards to mainframes to personal computers to personal digital assistants, we have made it less and less costly to compute new estimates from the same data set, and the supply of alternative estimated models has greatly increased, though almost all of these are hidden on personal hard drives or in Rosenthal’s File Drawers.

The classical econometrics that is still taught to almost all economists has no hope of remedying this unfortunate situation, since the assumed knowledge inputs do not come close to approximating the contextual information that is available. But the Bayesian priests who presume the existence of a prior distribution that describes the context are not so different from the econometric theorists who presume the existence of a model. Both are making assumptions about how the dirty work of data analysis in the basement is done or should be done, but few of either religious persuasion leave their offices and classrooms on the top floor and descend into the basement to analyze data sets. Because of the impossibility of committing to any particular prior distribution, the Bayesian logic turns the search for a model into a search for a prior distribution. My solution to the prior-ambiguity problem has been to design tools for sensitivity analysis to reveal how much the conclusions change as the prior is altered, some local perturbations (a point-to-point mapping) and some global ones (correspondences between sets of priors and sets of inferences).
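
The flavor of the global version can be conveyed with a short sketch; this is my illustration of the idea, not the actual tools from that work. A whole family of priors, from nearly vacuous to nearly dogmatic, is mapped into the set of estimates it implies, and the conclusion is reported as an interval rather than a single number.

```python
# Minimal sketch: global sensitivity of a slope estimate to the prior.
# The data, the prior family, and the numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.standard_normal(n)
y = 0.5 * x + rng.standard_normal(n)      # hypothetical data with a modest true slope

def posterior_slope(x, y, tau, sigma2=1.0):
    """Posterior mean of the slope under a N(0, 1/tau) prior (ridge-style shrinkage)."""
    return (x @ y) / (x @ x + sigma2 * tau)

taus = np.logspace(-2, 3, 50)             # a set of priors, from vague to dogmatic
estimates = [posterior_slope(x, y, t) for t in taus]
print(f"inferences implied by this set of priors: "
      f"[{min(estimates):.3f}, {max(estimates):.3f}]")
```

A conclusion is sturdy when it survives the whole sweep; if the interval of estimates straddles zero, the data alone have not settled the question, no matter how impressive any single specification looks.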

As I read what I have just written, I think this is hugely important and highly interesting. But I am reminded of the philosophical question: When Leamer speaks and no one listens, did he say anything? None of the tenured faculty in Economics at Harvard took any interest in this enterprise, and they gave me the Donald Trump treatment: You’re fired. My move to UCLA was to some extent a statement of approval for my book, Specification Searches, but my pursuit of useful sensitivity methods remained a lonely one. The sincerest form of admiration is copying, but no one pursued my interest in these sensitivity results. I did gain notoriety, if not admiration, with the publication of a watered-down version of my ideas in “Let’s take the con out of econometrics.” But not so long after that, finding that I was not much affecting the economists around me, and making less progress producing sensitivity results that I found amusing, I moved on to the study of International Economics, and later I took the professionally disreputable step of forecasting the US macro economy on a quarterly basis, a return to my Michigan days. I memorialized that effort with the book titled Macroeconomic Patterns and Stories, whose title is an elliptical comment that we don’t do science, we do persuasion with patterns and stories. And more recently, I have tried again to reach my friends by offering context-minimal measures of model ambiguity, which I have called s-values (s for sturdy), to go along with t-values and p-values. This one more attempt illustrates the fundamental problem: we don’t have the right tools.

It is my hope that the Leamer-Rosenthal Prizes will bring some added focus to these deep and persistent problems in our profession, stimulating innovations that can produce real transparency, by which I mean ways of studying data and reporting the results that allow both the analyst and the audience to understand the meaning of the data being studied, and how that meaning depends on the contextual assumptions.

This whole thing reminds me of the parable of the Emperor’s New Clothes.  Weavers (of econometric theory) offer the Emperor a new suit of clothes, which are said to be invisible to incompetent economists and visible only to competent ones.  No economist dares to comment until a simple-minded one hollers out “He isn’t wearing any clothes at all.”   The sad consequence is that everyone thinks the speaker both impolite and incompetent, and the Emperor continues to parade proudly in that new suit, which draws repeated compliments from the weavers:  Elegant, very elegant.

OK, it’s delusional.  I know.


About the author: Edward Leamer is the Chauncey J. Medberry Professor of Management, Professor of Economics and Professor of Statistics at UCLA. He joined the University of California at Los Angeles in 1975 as Professor of Economics and served as Chair from 1983 to 1987. In 1990 he moved to the Anderson Graduate School of Management and was appointed to the Chauncey J. Medberry Chair. Professor Leamer is a Fellow of the American Academy of Arts and Sciences, and a Fellow of the Econometric Society. He is a Research Associate of the National Bureau of Economic Research and a visiting scholar at the International Monetary Fund and the Board of Governors of the Federal Reserve System. He is currently serving as the Director of the UCLA Anderson Forecast. Dr. Leamer has published over 100 articles and 4 books.