What’s In a Name?


SYRACUSE, NY – When currency trader George Soros rolled out his Institute for New Economic Thinking last spring, at a conference at Cambridge University, the writer of The Economist’s “Economic Focus” feature who covered the conference invoked a metaphor devised by David Colander, of Middlebury College, to describe the Institute’s view of how economics may advance. Investigators climb up one mountain, only to discover, when they break through the clouds, that an adjacent mountain, even higher, affords a better view. Thus the twin-peaks vision of getting wise to ourselves.

The mountain to which Colander says economists cling he has dubbed Mt. Walras, after the nineteenth-century French economist who envisaged most clearly the formal analysis of mutual interdependence through a balance of forces that today is known as general equilibrium analysis. (I would have thought the most adventurous of them already had moved next door to the Game Theory Massif.) The peak in which Colander is interested he describes (elsewhere) as the foothills of “the complexity perspective.” You’ve got to step off the old mountain, lose some altitude, he says, in order to start up the new hill. 

And that’s what I was doing in Syracuse last week. John Bryan Davis, of Marquette University and the University of Amsterdam, and Philip Mirowski, of the University of Notre Dame, each of them a complexity Sherpa, were there to talk about “Complexity, the Crisis and Economics” at the annual meeting of the History of Economics Society.

My job was to provide an outsider’s perspective. I’m a journalist, not an economist or a historian. But in 1984 I published a book I called The Idea of Economic Complexity. Davis and Mirowski were curious about my view of “what has happened to the topic in the economics profession” in the last twenty-five years.

Has complexity entered the history of economic thought?

My book attracted little attention, but it was right about one thing.  Complexity, as I wrote, was “an idea on the tip of the modern tongue.” George Cowan, director of physics at Los Alamos National Laboratory, founded the Santa Fe Institute that same year “to explore complexity in science and technology,” with US government backing and advice from Nobel laureates Kenneth Arrow, an economist, and Philip W. Anderson, a physicist.

Then came a series of books, many of them centered on the Santa Fe Institute: the best-selling Chaos: Making a New Science, by James Gleick, then a science writer for The New York Times; Complexity: The Emerging Science at the Edge of Order and Chaos, by M. Mitchell Waldrop; Complexity: Life at the Edge of Chaos, by Roger Lewin; The Dreams of Reason: The Computer and the Rise of the Sciences of Complexity, by Heinz Pagels; At Home in the Universe: The Search for the Laws of Self-Organization and Complexity, by Stuart Kauffman; Hidden Order: How Adaptation Builds Complexity, by John Holland; The Quark and the Jaguar: Adventures in the Simple and the Complex, by Murray Gell-Mann; and, most recently, The Origin of Wealth: Evolution, Complexity, and the Radical Remaking of Economics, by Eric Beinhocker. (Santa Fe original Doyne Farmer is the subject of two books by Thomas Bass: The Eudaemonic Pie and The Predictors: How a Band of Maverick Physicists Used Chaos Theory to Trade Their Way to a Fortune on Wall Street.)

At a certain point, John Reed, chairman of Citicorp, signed on, providing funding to Santa Fe. From England, the cosmologist Stephen Hawking provided the money quote: “I think the next century will be the century of complexity.”

As the books rolled on, though, it seemed to me we were hearing more about the room in which the scientists gave their talks to one another at Santa Fe, the chapel of a seventeenth-century convent that housed the Institute, than about the impact of what was being said upon the world of science. I confess I gave up complexity after Waldrop’s book appeared in 1992. That’s the account in which the reporter, a physicist by training, explains how an operations researcher, Brian Arthur, was fomenting an intellectual revolution in economics at whose center was the economics of falling costs. By then I was reporting about what was known as the “new growth economics,” which was approaching increasing returns from a very different angle.

In 1995, the science journalist John Horgan asked, in Scientific American, “Is Complexity a Sham?” and answered in the affirmative. It seemed to me that Santa Fe’s funding was predicated as much on its proximity to George (Jay) Keyworth’s summer house (he had been science adviser to President Ronald Reagan) as on the acuity of Kenneth Arrow’s intuition.

By then, though, the consultants had got hold of complexity, and soon were giving conferences for business executives eager to bone up on the next big idea. (Sample come-on, a quote from playwright Tom Stoppard: “A door like this has cracked open five or six times since we got up on our hind legs.  It’s the best possible time to be alive, when almost everything you thought you knew was wrong.”)

As one wag said, complexity seemed to be most prevalent in its gaseous form. 

Two things changed my mind. One was a short article by the quantum engineer Seth Lloyd, of the Massachusetts Institute of Technology, in the Institute of Electrical and Electronics Engineers’ Control Systems magazine. The other was a paper by the economist Paul Romer in the Journal of Political Economy. So I resumed saving scraps. Those are what I took with me when I agreed to come to Syracuse.

The Seth Lloyd article appeared in 2001. He had cataloged more than fifty different measures of complexity – computational complexity, logical depth, information, entropy, fractal dimension, hierarchical complexity, topological epsilon-machine, channel complexity, tree subgraph diversity, and so on. He grouped them according to the kind of question each was designed to answer when groping for “the complexity of the thing (house, bacterium, problem, process, investment scheme)”: how hard was it to describe? How hard to create? What was its degree of organization?

Lloyd has an interesting history. Horgan had used the list of complexity flavors he got from Lloyd to disparage the quest; Lloyd had so many requests for his list that he finally wrote it up for the magazine, two pages, six years after the fact.  It quickly became his most cited paper.  (He is doing quantum computing now.)  He offered to debate Horgan, defending complexity, but nothing ever came of it. (Lloyd, too, wrote a Santa Fe book: Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos.)  

What caught my eye was an analogy. Perhaps the problem of measuring complexity, Lloyd wrote, was akin to the problem of describing electromagnetism before Maxwell’s equations. For eighty years after Franklin, electrical and magnetic phenomena continued to be regarded as fundamentally different forces. Michael Faraday showed they were closely related; indeed, he invented the electric motor (which he called a “magneto”). But it wasn’t until Maxwell that the underlying unity of the electromagnetic field began to be properly understood.

In the same way, Lloyd wrote, contemporary researchers in architecture, biology, computer science, dynamical systems, engineering, finance, game theory, etc., were asking similar questions about the complexity of their different subjects, and finding that their answers had much in common, even though they had defined different measures of complexity for each field.

That sounded plausible enough to me, so I perked up. I knew that hedge fund operators were making money on all kinds of pattern recognition, not necessarily publishing the results. Maybe they are the latter-day Faradays. I knew, too, about Zipf’s Law, an empirical regularity first reported in 1935 by the linguist George Zipf, who observed that the frequency with which an English word is used is inversely proportional to its rank in the frequency table (“the,” the most frequent word, appears roughly twice as often as “of,” the second most frequent, and three times as often as the word in third place, and so on). Similar “power laws” seem to describe the rank-size distribution of cities and of corporations, perhaps even the distribution of income.
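The regularity is easy to check. Below is a minimal sketch in Python (my illustration, not anything of Zipf’s): if frequency is proportional to 1/rank, then rank times frequency should hover around a single constant all the way down the table. The corpus.txt path is an assumption; any large plain-text English file will do.

import re
from collections import Counter

def zipf_table(text, top_n=10):
    # Rank words by frequency; report rank * frequency, which Zipf's law
    # predicts should be roughly constant down the table.
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    return [(rank, word, freq, rank * freq)
            for rank, (word, freq) in enumerate(counts.most_common(top_n), start=1)]

with open("corpus.txt") as f:  # assumed path to any large plain-text corpus
    for rank, word, freq, product in zipf_table(f.read()):
        print(f"{rank:>3}  {word:<12} {freq:>8}  rank x freq = {product}")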

So I reopened the file. I subscribed to Complexity Digest, an extremely interesting free newsletter edited by Carlos Gershenson in Mexico City. (It turns out there is a substantial complexity community in Cuba!) Much of the work of the last twenty-five years is summarized in Complexity: A Guided Tour, by Melanie Mitchell, a computer scientist (and Santa Fe science board member). Her last chapter begins by rehashing the hurt administered by Horgan. Her analogy was to the nineteenth-century advent of thermodynamics. The in-joke, she says, is that complexity is waiting for (Sadi) Carnot.

What about complexity in economics? I scanned the New Palgrave Dictionary of Economics (Second Edition), which included several entries, among them an article on “complex adaptive systems” by co-editor Steven Durlauf, of the University of Wisconsin, himself a long-time denizen of the Santa Fe Institute. I attended Barkley Rosser’s “Transdisciplinary Perspectives” conference in 2008 and wrote about it. (Horgan came, too.) I decided that I would read The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools and Societies, by Scott Page, of the University of Michigan, when the opportunity arose.

And I looked at Herbert Gintis’s thirteen-page review of Beinhocker’s The Origin of Wealth in the Journal of Economic Literature, in which Gintis enumerates “Five Big Ideas of Complexity” (dynamics, agents, networks, emergence and evolution). Those are five big ideas, all right, but I am not certain that they are exclusive to complexity. And once again, it is a Santa Fe scholar reviewing a book about the Santa Fe program. (For a low-key look at the Santa Fe program, see Samuel Bowles.)

The main currents in economics, or at least in macro, are to be found in two conference volumes put together by Colander ten years apart. The first, which appeared in 1996, is Beyond Microfoundations: Post Walrasian Economics and the second, in 2006, is Post Walrasian Macroeconomics: Beyond the DSGE model. I was struck by the fact that the word complexity, which, as I recall, was big in the conference that led to the ’96 volume, had all but dropped out of that discourse ten years later. It barely appears in the Colander introduction, having been replaced by an emphasis on pervasive cognitive limitations. (“Taking complexity seriously presents a major challenge to any rational expectations of the macromodel.”)

That’s where the excitement about “agent-based modeling” comes in. A bottom-up technique made possible by nearly limitless computer power, agent-based modeling endows simulated agents in economic situations with a handful of local decision-making rules and then simply lets the model run, in order to see which strategies win out. It was some forty years ago that Thomas Schelling, of the University of Maryland, demonstrated with a simple checkerboard how racial “tipping” might have produced the observed patterns of racial segregation in American cities. “How could one not be persuaded by arguments [by Leigh Tesfatsion, of Iowa State University; Rob Axtell, of George Mason University; and Blake LeBaron, of Brandeis] for more agent-based modeling?” writes Alan Kirman, of the Institute for Advanced Study at Princeton and the University of Marseilles, in the foreword to Colander’s book.

I’m with him.
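For the flavor of the technique, here is a minimal sketch of a Schelling-style checkerboard in Python (a toy of my own devising, not Schelling’s original specification nor any of the models cited above). Two types of agents live on a grid; each is content so long as at least thirty percent of its occupied neighboring cells hold its own type; the discontented jump to random empty cells until everyone settles.

import random

SIZE, EMPTY_FRAC, THRESHOLD = 20, 0.1, 0.3  # grid side, share empty, tolerance

def neighbors(grid, r, c):
    # The eight cells around (r, c), wrapping at the edges.
    return [grid[(r + dr) % SIZE][(c + dc) % SIZE]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

def unhappy(grid, r, c):
    # An agent is unhappy if fewer than THRESHOLD of its occupied
    # neighbors share its type.
    me = grid[r][c]
    if me is None:
        return False
    occupied = [n for n in neighbors(grid, r, c) if n is not None]
    if not occupied:
        return False
    return sum(n == me for n in occupied) / len(occupied) < THRESHOLD

def sweep(grid):
    # Move every currently unhappy agent to a random empty cell;
    # return the number that moved.
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(grid, r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
    random.shuffle(movers)
    for r, c in movers:
        dest = random.choice(empties)
        empties.remove(dest)
        empties.append((r, c))
        grid[dest[0]][dest[1]], grid[r][c] = grid[r][c], None
    return len(movers)

cells = [random.choice("XO") if random.random() > EMPTY_FRAC else None
         for _ in range(SIZE * SIZE)]
grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]
for _ in range(200):  # cap the sweeps; the grid usually settles much sooner
    if sweep(grid) == 0:
        break
print("\n".join("".join(cell or "." for cell in row) for row in grid))

Run it a few times: even that mild thirty-percent preference reliably sorts the grid into sharply segregated blocks, the tipping result, without a single equation of the system ever being written down.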

xxxxx

The other thing that persuaded me to think a little more about complexity was a series of papers in the mid-1980s that culminated in Romer’s 1990 paper, “Endogenous Technological Change.” Here I just want to mention where the word came from in my case — not from Romer; not from the late Peter Albin, who played the lead role in The Idea of Economic Complexity (Albin was sidelined by a massive stroke just before he was slated to succeed Hyman Minsky at the Levy Institute); but from Allyn Young, another economist who died too young – at 52, in the influenza epidemic of 1929, just as he was preparing to return to Harvard from the London School of Economics.

In “Increasing Returns and Economic Progress,” his September 1928 presidential address to Section F of the British Association for the Advancement of Science, Young argued that Adam Smith’s dictum, that the division of labor is limited by the extent of the market, had been both neglected and misunderstood. The expectation that costs must generally rise and profits fall was widespread among economists, he noted; in fact, the opposite was far more commonly the case, because increasing specialization fostered an ever-broadening market (and vice versa). The mechanism couldn’t be adequately discerned by scrutinizing the individual firm or the particular industry; industrial operations had to be seen whole. By way of illustration, he described the evolution of the printing trades.

The successors of the early printers, it has often been observed, are not only the printers of today, with their own specialized establishments, but also the producers of wood pulp, of various kinds of paper, of inks and their different ingredients, of type-metal and of type, the group of industries concerned with the technical parts of the producing of illustrations, and the manufacturers of specialized tools and machines for use in printing and in the various auxiliary industries.

It was to describe this process that Young employed the word “complexity”:

Notable as has been the increase in the complexity of the apparatus of living, as shown by the variety of goods offered in consumers’ markets, the increase in the diversification of intermediate products and of industries manufacturing special products or groups of products has gone even further.

Young’s “apparatus of living” was “the division of labor” to Adam Smith – what a present-day economist means by “development.” But go looking for “degree of development” as a live variable in growth economics, much less macroeconomics, and you’ll find it is simply not there. Instead you must settle for the more elusive concepts of “technological change” and “institutions.” Back when I borrowed the idea of “complexity” as a potential yardstick to describe the vast changes that had occurred in the division of labor, this dimension of things was, as nearly as I could tell, invisible in mainstream economics.

This mystery was substantially solved by Romer, almost incidentally, in the course of building a model of economic growth that included the production of knowledge as an economic good. He located the answer in the way that the great English logician and economist John Stuart Mill laid down the architecture of modern economics in Principles of Political Economy in 1848, towards the end of the “Preliminary Remarks” with which he  began the book:

In so far as the economical condition of nations turns upon the state of physical knowledge, it is a subject for the physical sciences, and the arts founded on them. But in so far as the causes are moral or psychological, dependent on institutions and social relations, or on the principles of human nature, their investigation belongs not to physical, but to moral and social science, and is the object of what is called Political Economy.

So much the worse for economics, then, if it rules out having anything to say about the growth of knowledge. (Here’s Mill’s take on the economics of transportation: “A canal or a railway embankment cannot be made without a combination of many laborers; but they are all excavators, except the engineers and a few clerks.”) The history of the division of labor, even today, is the best and highest use to which the idea of economic complexity can be put. The closest anyone has come to it so far, it seems to me, is Ricardo Hausmann, of Harvard’s Kennedy School of Government, who with several colleagues has promoted a version of niche space that is intuitively appealing (“The Product Space Conditions the Development of Nations,” Science, 27 July 2007). Interestingly enough, Harvard’s Hendrik Houthakker compared the division of labor to speciation in 1956. His model went nowhere.

Indeed, the most recent thing to catch my eye is Biology’s First Law: The Tendency for Diversity and Complexity to Increase in Evolutionary Systems, by Daniel McShea and Robert Brandon, a couple of philosophers of biology, both of Duke University (University of Chicago Press, forthcoming). They describe three aspects of life on earth that seem to require explanation: adaptation, diversity and complexity. Adaptation is “the fit between organism and environment”; diversity is “the great variety of organisms”; and complexity, “the enormous intricacy of their internal structure.” Natural selection explains adaptation, they say. But what explains diversity and complexity? A spontaneous tendency toward increased diversity and complexity, whether or not natural selection is present, is the “first law of biology,” they say. Eventually it will require the same kind of reconceptualization of the field that Newton’s First Law brought to physics.

They may be right. To proclaim it a law themselves seems somewhat grandiose (yes, McShea did a post-doc at Santa Fe). On the other hand, it is the sort of insight I thought must soon be forthcoming when I wrote, twenty-five years ago, “You would have thought that the existence of varying degrees of complexity would have been recognized and dealt with long ago, but the discovery of complexity is one of the major – if unheralded – scientific stories of our time.” So who am I to complain? I plan to continue to read Complexity Digest.

