From the Armchair to the Computer


It may not be an accident that, in reporting the award last week of the Nobel Memorial Prize in Economic Sciences to Thomas Sargent, of New York University, and Christopher Sims, of Princeton University, The New York Times referred on subsequent reference to “Dr. Sargent” and “Dr. Sims.”

The honorific was a promotion.  Last year Peter Diamond, Dale Mortensen and Christopher Pissarides were each “Professor” on second reference. Mr. and Ms. were good enough for Oliver Williamson and Elinor Ostrom the year before (though in previous years, “Professor” was occasionally employed on second reference, depending, apparently, on the relative saltiness of the reporter).

But then Sargent and Sims are among the most difficult to read of contemporary economists, and the most difficult to explain in the everyday language (and depth of field) of newspapers. They are also among the relative handful of economists who are members of the National Academy of Sciences. So there may be some relation between those circumstances and the choice of honorific. In Times Nobel stories, physicists and chemists have been Drs. for years.

The citation of the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel specified that the award had been given for “applied macroeconomics,” a field that was transformed after Sargent, Sims and a dozen others started work in the early 1970s.

It’s not that macroeconomics was ever short on empirical work. Robert Gordon, of Northwestern University, notes that Simon Kuznets worked on its problems from the beginning; that Milton Friedman and Anna Schwartz published their A Monetary History of the United States in 1963; and that Franco Modigliani, at his core an empirical macroeconomist, was one of the chief architects of large-scale models that are still in use today by the best macro forecasting firms, including MacroAdvisers. Yet it is theory that determines what we can observe. And the history of applied macroeconomics can be sketched in terms of four dominating events:

In The General Theory of Employment, Interest and Money, in 1936, John Maynard Keynes successfully argued for the existence of a special branch of economics – macroeconomics, a theory of output as a whole – as something apart and, he asserted, quite different from the operation of individual markets: the more familiar tradition, described by Keynes as “classical” economics, that came to be known as microeconomics.

In 1960, Paul Samuelson and Robert Solow, both of the Massachusetts Institute of Technology, found a striking relationship between inflation and unemployment in US data – a Phillips curve, they called it, after A.W. Phillips, the New Zealander who had first delineated the relationship in British data – in which higher inflation was associated with lower unemployment. Such a trade-off might be exploitable for policy purposes, they wrote (with many qualifications): governments could reduce unemployment by raising the inflation rate.
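
In its simplest textbook form – a stylized sketch, not Samuelson and Solow’s own specification – the relationship can be written:

```latex
% A stylized Phillips curve: \pi_t is inflation, u_t is unemployment,
% u^* is a benchmark unemployment rate, and \alpha > 0 sets the
% steepness of the apparent trade-off.
\[ \pi_t = -\,\alpha\,(u_t - u^*) \]
```

Read naively, a government could buy permanently lower unemployment by accepting permanently higher inflation.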

In 1976, Robert Lucas, of the University of Chicago, published an especially difficult paper, “Econometric Policy Evaluation: A Critique,” in which he ratified earlier intuitions by Edmund Phelps and Milton Friedman that, because individuals’ expectations, broadly speaking, would defeat policymakers’ intentions, no such trade-off would exist in the long run. He demonstrated why the econometric tests then in use cavalierly rejected their views. Around the same time, British central banker Charles Goodhart put the same idea more simply: “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.” And whether by Goodhart’s Law or the Lucas Critique (or as a result of a little group of applied explications by Lucas, Sargent, Sims and others that convincingly demonstrated the import of critiques and laws), forward-looking expectations had been placed at the center of economics. Macroeconomics was becoming behavioral.
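
The Friedman-Phelps point can be compressed into one line – again a stylized sketch, not Lucas’s own formalism – by adding expected inflation to the curve above:

```latex
% The same curve, augmented with expected inflation \pi^e_t:
\[ \pi_t = \pi^e_t - \alpha\,(u_t - u^*) \]
% Once expectations catch up with experience (\pi^e_t = \pi_t), the
% inflation terms cancel and u_t = u^*: in the long run there is no
% trade-off left for policy to exploit.
```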

Finally, there came the dominating experiment in the control of inflation conducted by Federal Reserve chairman Paul Volcker. Soon after taking office, in 1979, Volcker raised the federal funds interest rate to 20 percent from 11 percent and held it high as long as he dared – until August 1982. Unemployment soared and factory utilization plummeted, but inflation dropped from 13.5 percent to 3.5 percent. The recession ended, growth resumed, and as the economy snapped back, financial assets soared, but inflation remained low for the next 25 years. Many other interesting policy interventions were in the works, fiscal policy in particular, but the inflation rate was the big story.

Sargent and Sims and their respective communities have been at work on these episodes ever since, connecting theory and practice, building models and adopting new statistical techniques to judge the policies that central banks, especially the Fed, were employing to control the economy and manage interest and inflation rates. They built more elaborate models, ones that did not embed the Keynesian assumption of sticky wages in the setups used to judge policy interventions. (See this column, by Louis Johnston, of the College of Saint Benedict and Saint John’s University, for an insider’s view.) Sims, in particular, along with Robert Litterman, long gone to Wall Street, introduced economists to a technique known as vector auto-regression that has permitted them to compare forecasts with much more precision than formerly, and so make better ones.
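
For readers who want a feel for the mechanics, here is a minimal vector autoregression estimated by ordinary least squares on made-up data – a sketch of the idea only, not the Sims-Litterman implementation; the series, the single lag and the coefficients are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two made-up quarterly series -- call them inflation and unemployment.
T = 200
A_true = np.array([[0.7, 0.1],
                   [-0.2, 0.8]])            # coefficients used to generate data
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.5, size=2)

# A VAR(1) regresses today's values on yesterday's, equation by equation.
X, Y = y[:-1], y[1:]                        # lagged regressors, current values
B, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least squares: X @ B ~= Y
A_hat = B.T                                 # so that y_t ~= A_hat @ y_{t-1}

print("estimated coefficients:\n", A_hat)   # should be close to A_true
print("one-step-ahead forecast:", A_hat @ y[-1])
```

Each variable is regressed on lagged values of every variable in the system, so the data, rather than a prior theoretical structure, decide which movements foreshadow which – that is the sense in which the technique sharpened forecasting.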

At a 1996 conference celebrating the twentieth anniversary of the Lucas Critique, Sargent wrote, “The intellectual retreat of the Phillips curve helps account for the resurgence of low-inflation monetary regimes throughout the world today.” A couple of years later, he published The Conquest of American Inflation, in which he sought to show exactly how policymakers had learned and applied the lessons of the previous twenty years, including a little self-subversion. Samuelson and Solow had been well aware of the limits of the “trade-off” that they identified; and central bankers, who remained wedded to various “augmented” versions of Phillips curves to inform their views, had been able to control inflation surprisingly well. (This was typical. Earlier this year Sargent became the second economist to win the ultimate good citizenship medal in science, the National Academy’s Award for Scientific Reviewing, given annually since 1979 for synthesizing an extensive and difficult literature. James Poterba, of MIT and the National Bureau of Economic Research, was the first.)

It’s all hearsay, especially on Sunday morning. If you want to learn a little more about structural macro, see this primer by Narayana Kocherlakota, president of the Federal Reserve Bank of Minneapolis. For a glimpse of the utility of VAR techniques, see these lecture slides. In recent years, Sargent and Sims have expanded their investigations in many directions. Sargent, in particular, has been an occasional dabbler in economic history, searching for dramatic examples of points he wants to make through subtle theorizing. The Nobel citation describes how he has pursued a line he calls “robust control,” implying that households routinely adopt expectations that are somewhat more pessimistic than would be the case if they were purely rational, while Sims has investigated a phenomenon he calls “rational inattention,” which stems from the brain’s limited information-processing capability. Sargent is consistently conservative in his views, while Sims, as Pete from Hoboken noted on Paul Krugman’s blog last week, “liked to talk an awful lot in his grad macro class about New Keynesian stuff. His whole Rational Inattention push seems about as ‘sticky’ as it gets.”
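
Skeletally – and with the caveat that this compresses a substantial literature into a single line – the rational-inattention setup asks what a decision-maker does when attention itself is scarce: choose a signal about the state of the world to minimize expected loss, subject to a ceiling on information flow measured in Shannon’s terms:

```latex
% Rational inattention in skeletal form: x is the state, s a signal
% the agent chooses, a(s) the action taken on it, L the loss function;
% I(x; s) is Shannon mutual information and \kappa the agent's limited
% processing capacity.
\[ \min_{s}\; \mathbb{E}\bigl[\,L(a(s),\,x)\,\bigr]
   \quad \text{subject to} \quad I(x;\,s) \le \kappa \]
```

Because information can only trickle through the constraint, behavior responds sluggishly to news – one reason the approach can look, as the commenter says, about as “sticky” as it gets.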

There is, of course, a school of thought among economists that holds that the sophisticated mathematics of empirical macroeconomics is little more than a blind alley.  In Beyond Mechanical Markets, Roman Frydman, of New York University, and Michael Goldberg, of the University of New Hampshire, deride the presumption that individuals make decisions “as if they adhered strictly and permanently to overarching mechanical rules that economists themselves fully specify in advance.”  They offer instead an “economics of imperfect knowledge,” in which individuals’ clumsy interpretations of fundamentals drive price swings and government’s responsibility is to counter their mis-assessments.

And Arnold Kling, an MIT-trained economist with Austrian leanings, who blogs on the excellent Liberty Fund site, wrote last week, “[I]f Sargent and Sims represent a slap in the face to Keynes, they must be regarded as a knee to the groin of [Friedrich] Hayek. Hayek coined the term ‘scientism’ to describe the pretentious pose that economists strike when they equate mathematics with rigor. If scientism is a germ that infects economics, then Sargent and Sims were responsible for unleashing some of the most virulent strains.”

In the early days of the Nobel prize in economics, there was a tradition of trying to put into one sentence the reasons for which laureates had been awarded their prize (don’t put all your eggs in one basket, people save for a rainy day, etc.). Sargent, before he picked up his CME Group-MSRI Prize in Innovative Quantitative Applications in Chicago last month, wrote a concise description of the program of formal logic that he did so much to turn into mainstream economics.

The assumption that people share common beliefs about underlying sources of uncertainty [that is, that expectations are rational] underpins influential doctrines in modern economics, including (1) how the distinction between expected and unexpected government actions affects inflation-unemployment dynamics; (2) how to cast optimal fiscal and monetary policy as a dynamic mechanism designed to cope with enforcement and information limits; (3) how timing protocols that capture a government’s ability to commit give rise to optimal fiscal and monetary policies that are time inconsistent; and (4) how reputation can substitute for commitment.

Because central banks and treasuries want to implement solutions of optimal policy problems like (2) in contexts like (1), in which the distinction between foreseen and unforeseen policy actions is important, a time consistency problem like (3) arises, prompting them to focus on ways like (4) to sustain good expectations. By showing how to modify the common beliefs assumption [the presumption that everyone’s expectations are the same] in ways that take account of both adaptive learning and of economic agents’ response to statistical model uncertainty, economic research has also set down foundations for better models in the future.

That’s three sentences. It’s a pretty good summary, though, of what Sargent means when he describes macroeconomics as having traded, for its chief tool, the armchair for the computer.

I tried my hand at a translation, but was not very satisfied with the results.

Governments inevitably try to fool people, at least some of the time. Most people are not fooled, and certainly not for long. Better governments await a better understanding of men and women, and a better understanding of their rules.

EconomicPrincipals readers can do better than this. A book prize, and publication in this space on December 11, for the best translation of Sargent’s hieratic formulation into a demotic one.


4 responses to “From the Armchair to the Computer”

  1. Economics, because it is wrapped around people, is not an exact science. Our hopes (and consequent actions) are more often not about interest rates but about how we want to see ourselves. Governments should nurture our hopes.

  2. 1. Pay no attention to the man behind the curtain. 2. Pay attention to the man behind the curtain.
