Complexity, heuristics, and the traveling salesman problem

Add this one to your long-reads queue, because it’s well worth it: Tom Vanderbilt writes in Nautilus about the traveling salesman problem and how algorithmic optimization helps us understand human behavior more deeply. It’s a thorough and nuanced analysis of the various applications of algorithms to solve the traveling salesman problem — what’s the most efficient way (which you of course have to define, whether in time, money, fuel, etc.) to deliver some number n of packages to some number d of destinations, given your number t of trucks/drivers? This is a tough problem for several reasons, and Vanderbilt’s discussion of those reasons is clear and interesting.

We can start with a simple transportation model with small numbers of packages, destinations, and trucks. But as those numbers increase, the problem becomes combinatorially harder: the number of possible routes grows factorially with the number of stops, so adding even a few destinations can swamp an exact search. Then think about what happens when the locations of the destinations change every day, as is the case for UPS and FedEx deliveries. Then think about what happens when you add in heterogeneity of the deliveries; Vanderbilt opens with a girl pointing out that her mother would never buy perishables and then leave them in the car all day, so the nature of the item changes the constraints on the definition of the efficient route.
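
That combinatorial explosion is why real logistics systems rely on heuristics rather than exact optimization. Here is a minimal sketch (with made-up coordinates, not anything from Vanderbilt’s article) contrasting an exact brute-force search, which tries every ordering and is only feasible for a handful of stops, with the classic nearest-neighbor heuristic, which is fast but can return a noticeably longer route:

```python
import math
from itertools import permutations

def distance(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def tour_length(tour):
    """Total length of a closed tour that returns to its starting point."""
    return sum(distance(tour[i], tour[(i + 1) % len(tour)])
               for i in range(len(tour)))

def brute_force_tour(points):
    """Exact answer: try all (n-1)! orderings. Feasible only for tiny n."""
    start = points[0]
    best = min(permutations(points[1:]),
               key=lambda rest: tour_length((start,) + rest))
    return (start,) + best

def nearest_neighbor_tour(points):
    """Greedy heuristic: always visit the closest unvisited stop next.
    Runs in O(n^2) time but offers no optimality guarantee."""
    unvisited = list(points[1:])
    tour = [points[0]]
    while unvisited:
        nxt = min(unvisited, key=lambda p: distance(tour[-1], p))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

# Six hypothetical delivery stops
stops = [(0, 0), (2, 1), (5, 0), (3, 4), (1, 3), (4, 2)]
exact = tour_length(brute_force_tour(stops))
greedy = tour_length(nearest_neighbor_tour(stops))
print(f"optimal: {exact:.2f}, nearest-neighbor: {greedy:.2f}")
```

With six stops the brute-force search examines only 120 orderings; with twenty-five stops it would need more orderings than there are atoms in the universe, which is why dispatchers and routing software settle for "good enough" heuristics.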

Her comment reflects a basic truth about the math that runs underneath the surface of nearly every modern transportation system, from bike-share rebalancing to airline crew scheduling to grocery delivery services. Modeling a simplified version of a transportation problem presents one set of challenges (and they can be significant). But modeling the real world, with constraints like melting ice cream and idiosyncratic human behavior, is often where the real challenge lies. As mathematicians, operations research specialists, and corporate executives set out to mathematize and optimize the transportation networks that interconnect our modern world, they are re-discovering some of our most human quirks and capabilities.

One of the most intriguing aspects of implementing logistics that reflect good solutions to the TSP, which Vanderbilt highlights, is the humanity of the driver. Does the dispatcher know the driver, and know whether he or she is reliable? That may affect how to define the route, and how many stops or changes to put on that particular person’s schedule. Is the driver prone to fatigue, and how does that fatigue affect the driver’s decision-making? What are the heuristics or rules of thumb that different drivers use to make decisions in the face of uncertainty, given the cognitive limitations of humans? How different will the heuristics of different drivers be, and how do they affect the logistics of the system?

What Vanderbilt finds is that good logistics systems take into account the organic, emergent system that incorporates those heuristics when devising the TSP algorithm. They leave a human component in the logistics, but also use the human component to inform and change the algorithm. Another important element is data, because all such algorithms work in conjunction with data such as location; GIS mapping capabilities improve the data used to establish, test, and monitor TSP algorithms.

Economic theories as recipes

Michael Giberson

Today marks the 100th anniversary of the birth of Julia Child, and surprisingly, it was a blog post on economic theorizing that reminded me of the famous cookbook author this morning. I’m sure it wasn’t quite the metaphor Rajiv Sethi had in mind when he posted “On Prices, Narratives, and Market Efficiency,” but the article suggested to me that just maybe economic theories are a bit like recipes.

Sethi’s article reacts to John Kay’s response to Robert Lucas’s article addressing the Queen of England’s question: Why had economists failed to predict the financial crisis? More directly, Sethi explores the issue of how economists should change the way they work in order to better understand economic fluctuations. Sethi likes Kay’s formulation, of prices as the “product of a clash between competing narratives,” and he ties it usefully to some existing theorizing by Michael Harrison and David Kreps on speculators with heterogeneous beliefs.

The idea [from Harrison and Kreps] that prices are “obtained through market aggregation of diverse investor assessments” is not too far from Kay’s more rhetorically powerful claim that they are “the product of a clash between competing narratives”.  What Harrison and Kreps do not consider is how diverse investor assessments change over time, since beliefs about transition probabilities are exogenously given in their analysis. But Kay’s formulation suggests how progress on this front might be made.

Kay suggests that more than just deductive logic will be needed to understand the variety of conflicting narratives and how they change; Sethi, while agreeing, adds that deductive reasoning could be pushed still further.

The economic theories as recipes metaphor suggests that economics is just a way to get from a bundle of raw elements, different kinds of data and interpretations, to a more useful product. Unsettling in such a metaphor is that recipes are many and diverse, and unifying principles appear to be hard to find. There is no general theory of cooking that will tell us the one best way to combine a given set of raw materials into the perfect finished product. It depends on what you want to produce, and what the tools are at hand.

Perhaps, too, there is no one best way to transform a given set of observations into the perfect slice of economic understanding; perhaps there is no general theory of economics. This is not Sethi’s argument, and it is not even my own settled view. Yet if the economy itself is always open to entrepreneurial insight, which is one way of saying we are never at a point where improvement is impossible, then economics itself is similarly always open to insight and improvement.

Doing what seems like it should work: Experiments, tests, and social progress

Michael Giberson

My title is a little grand, at least the “and social progress,” but maybe it will be justified in some later, more carefully worked out version of the ideas clashing about in my head. As this is a blog, I’m sharing the more immediate, less carefully worked out version. ;-)

I’ve been reading Redirect: The Surprising New Science of Psychological Change. My wife brought it home from the library and then recommended it to me. (Thanks!) The book makes some surprisingly strong claims for personal improvements from what the author calls “story editing,” a bundle of techniques that subtly (or sometimes not so subtly) get people to revise their self narratives. (More from the Scientific American blog, more from Monitor on Psychology.)

A counterpart to that focus of the book is an emphasis on testing social and psychological interventions to discover what actually works. Author Timothy Wilson details numerous self-help and social change projects, some of which capture millions or even billions of dollars in public support, which seem like they should work but, when subjected to careful evaluation, show no evidence of success. In fact, some very expensive programs actually seem to worsen the problem they were designed to fix: programs to fight teenage smoking that lead to higher smoking rates, programs to discourage teenage pregnancy that lead to higher pregnancy rates, efforts to discourage littering – or cheating – on campus that have the opposite effects. Wilson advocates a strong preference for testing social interventions with randomized control experiments when possible (and ethical). When randomized control tests are not possible, then other attempts at measurement and replication are important, even though they are difficult to do well.

Whether or not “story editing” is key to successful personal and social change – Wilson makes a strong case, but he could be cherry picking his evidence and I’m sure he has his professional critics – the emphasis on experimentation and testing interventions is an important one.

Lynne’s posts last week on experimentation in social contexts are related: Economic experimentation, economic growth, and regulation and Experimentation, Jim Manzi, and regulation/deregulation. I’m most of the way through Russ Roberts’s EconTalk interview with Jim Manzi that Lynne mentioned in the second-listed link (recommended); Manzi makes related arguments in favor of well-designed experiments where possible, and for trial-and-error experimentation where controlled experimentation is not possible.

In both Wilson’s book and the Manzi interview (and apparently in Manzi’s book Uncontrolled, which I haven’t read yet), the limits of multivariate analysis of naturally generated data – i.e., almost all econometric analysis – are examined and found wanting. As Manzi explains, “omitted variable bias” is massive when examining data on human systems; the systems are simply too complex to produce reliable, non-obvious predictions via multivariate analysis because you cannot control for all of the possible effects and interactions influencing the data. He suggests that while 90 percent of studies relying on well-designed randomized control experiments are subsequently replicated, that figure drops to 20 percent or so for studies relying primarily on well-designed multivariate analysis.
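
Manzi’s point about omitted variable bias can be made concrete with a toy simulation (hypothetical numbers of my own, not from his book). When a confounder that drives both the explanatory variable and the outcome is left out of a regression, the estimated effect is systematically wrong, and more data does not fix it:

```python
import random
import statistics as stats

random.seed(0)
n = 10_000

# Hypothetical data-generating process with an unobserved confounder z:
# x is partly driven by z, and y depends on both x and z.
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 1) for zi in z]
y = [2 * xi + 3 * zi + random.gauss(0, 1) for xi, zi in zip(x, z)]

def ols_slope(xs, ys):
    """One-variable OLS slope: cov(x, y) / var(x)."""
    mx, my = stats.fmean(xs), stats.fmean(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / (len(xs) - 1)
    return cov / stats.variance(xs)

# Regressing y on x alone omits z, so part of z's effect is
# misattributed to x: the estimate converges to 3.5, not the true 2.
naive = ols_slope(x, y)
print(f"true effect of x: 2.0, naive estimate: {naive:.2f}")
```

Even with ten thousand observations the naive estimate sits near 3.5 rather than the true 2, because the omitted confounder's influence is absorbed into x's coefficient; no sample size cures a misspecified model, which is Manzi's core complaint about econometric analysis of complex human systems.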

In a post on the deterrence effect of the death penalty, Timothy Taylor provides an example of the difficulties of using multivariate analysis to examine social policy. Taylor draws on a recent National Research Council study on the topic, which like a similar study published in 1978 has concluded “available studies provide no useful evidence on the deterrent effect of capital punishment.” Taylor then explains several reasons why it has been hard to draw firm conclusions from the data. While he doesn’t use the term “omitted variable bias,” it is among the problems that the NRC study finds hampering results in this area.

The views of both Wilson and Manzi, and the case study on the effects of the death penalty, all point to a certain humility concerning our claims to understand how the world works. But humility isn’t the end of the story, and it isn’t an argument to stop; it is an argument to hold our beliefs about the social world less conclusively and to trust them selectively: trust knowledge derived from replicated randomized-control experiments most, trust knowledge from replicated multivariate analysis much less, and trust knowledge based on trial-and-error learning less as well.

These ideas will, once better worked out in my head, probably also mention Vernon Smith’s work on constructivist and ecological rationality. Of course, V. Smith is known to be a fan of experimental approaches to understanding social phenomena as well.

The constructivist way forward: experimentation! testing! social progress!

New paper: Knowledge Problem

Lynne Kiesling

I have a new paper that may be of interest to KP readers, since the subject of the paper is the same as the name of this site: Knowledge Problem. I am honored to have been invited to contribute this paper to the forthcoming Oxford Encyclopedia of Austrian Economics (Peter Boettke and Chris Coyne, eds.). Here’s the abstract:

Hayek’s (1945) elaboration of the difficulty of aggregating diffuse private knowledge is the best-known articulation of the knowledge problem, and is an example of the difficulty of coordinating individual plans and choices in the ubiquitous and unavoidable presence of dispersed, private, subjective knowledge; prices communicate some of this private knowledge and thus serve as knowledge surrogates. The knowledge problem has a deep provenance in economics and epistemology. Subsequent scholars have also developed the knowledge problem in various directions, and have applied it to areas such as robust political economy. In fact, the knowledge problem is a deep epistemological challenge, one with which several scholars in the Austrian tradition have grappled. This essay analyzes the development of the knowledge problem in its two main categories: the complexity knowledge problem (coordination in the face of diffuse private knowledge) and the contextual knowledge problem (some knowledge relevant to such coordination does not exist outside of the market context). It also provides an overview of the development of the knowledge problem as a concept that has both complexity and epistemic dimensions, the knowledge problem’s relation to and differences from modern game theory and mechanism design, and its implications for institutional design and robust political economy.

In this paper I analyze the development of the two categories of the knowledge problem — the complexity knowledge problem and the contextual knowledge problem — and explore both the history of the development of these concepts and their application in robust political economy and new institutional economics. As is the hallmark of a good research project, I think on balance I learned more than I created in the process of writing this paper.

One other thing I made sure to include was a discussion of how the knowledge problem and its development relates to game theory and mechanism design, through the work of Oskar Morgenstern (and then through some of the work of Herb Simon and Vernon Smith, among others).

Tying together economics, institutional design, history of thought, and epistemology, I hope you find this paper informative and useful! I’ll also make sure to update when the full volume is available.

Music, harmony, and social cooperation

Lynne Kiesling

I am a big fan of English renaissance choral music, particularly sacred polyphony from Tallis and Byrd (and stretching back to Taverner, but he’s not as distinctively polyphonic). One of the best ensembles performing such music is Stile Antico, a group of 13 British singers who do an outstanding job with this music, and whose recordings I have recommended here before. Especially at this time of year, their music really resonates and adds joy and beauty to life.

A couple of weeks ago we got to hear Stile Antico perform live in Milwaukee: Thomas Tallis’ Puer Natus Est mass interspersed with pieces from Byrd, White, and Taverner. The music was gorgeous, the voices delightful, and the artists charming and gracious.

But what really struck me was their method of decentralized coordination. Typically, when we think of musical performance beyond, say, a chamber quintet, coordination involves hierarchy in the form of a conductor, to “keep everyone on the same page”. The larger the number of performers doing different things, the harder it is to coordinate, and therefore the greater the need for a conductor … right?

Not so in this case. 13 singers, each with a particular part, bringing a distinctive element to the work. But in some ways the music is simultaneously so lush and yet so spare that if their timing is off, the beauty of the result is diminished. 13 singers with no conductor, and they coordinate by taking their visual and verbal cues from each other in a dynamic and evolutionary manner. This is a vivid example of decentralized coordination.

Of course the goal is harmony (in the general sense). If each individual acts and reacts to the actions of the other individuals in a way that produces a harmonious outcome, that’s beauty. And it’s an emergent outcome; each has his or her own score and acts accordingly, adapting to the actions of the others in a way that creates emergent harmony.

The music metaphor illustrates achieving emergent order through decentralized coordination, and it’s a metaphor for social cooperation too. Adam Smith employs the harmony metaphor for social cooperation in The Theory of Moral Sentiments, in which he invokes harmony as a desirable outcome of social interaction repeatedly (and refers to the music metaphor directly in the last reference). Note the emphasis on harmony as distinct from uniformity — each individual brings personal, private, heterogeneous features to social interaction (whether musical or economic), and they are not the same, not uniform. Each has an incentive, a desire to coordinate, to harmonize; in music it’s finding the complementary notes, in social systems it’s grounded in our innate desire for sympathy and mutual sympathy, according to Smith. Each individual brings something different to the party/performance/market.  The most beautiful and sublime outcomes emerge when each acts on its individual traits with a view toward creating harmony and sympathy. And it does not necessarily require the top-down imposition of control or system-wide hierarchy, but can be achieved through decentralized coordination.

Of course there are limits to applying the music metaphor to institutional design and social cooperation, such as the scale/number of actors. But it reminds us of the possibility of cooperation and harmony through decentralized coordination, without the need for imposed system-level control.


An example of what not to do in persuasion

Lynne Kiesling

Alex Tabarrok has an excellent post this morning at Marginal Revolution:

David Warsh and Paul Krugman try to write Hayek out of the history of macroeconomics. …

It is true that many of Hayek’s specific ideas about business cycles vanished from the mainstream discussion under the Keynesian juggernaut but what Krugman and Warsh miss is that Hayek’s vision of how to think about macroeconomics came back with a vengeance in the 1970s. …

… Hayek was an important inspiration in the modern program to build macroeconomics on microfoundations. The major connecting figure here is Lucas who cites Hayek in some of his key pieces and who long considered himself a kind of Austrian.

I offer this as a cautionary “what not to do” note to students in particular, but also to all of us. In the piece to which Alex is responding, Krugman chooses his definition of “modern macroeconomics” in a way that clearly maps into his preconceptions and reflects his confirmation bias. Such a rhetorical stratagem is unscientific and anti-intellectual. It’s also easy to critique (no disrespect intended for Alex’s good, pointed critique) by simply looking at the literature and seeing that modern macro encompasses a breadth of ideas and approaches, many of which are substantially informed by models and methodological approaches that Krugman chooses to reject.

Thus both on intellectual grounds and with a view toward crafting an argument that is persuasive to those who don’t already agree with you and share your worldview, don’t do this. Being more ecumenical and treating the contributions of your intellectual opponents with respect will make your arguments more thorough, effective, and persuasive.

On a substantive note, I’d like to echo the recommendation that Jacob Levy made in the comments on Alex’s post; the conclusion of Warsh’s essay is a good one, and suggests that incorporating more of a complexity approach into macro would enable us to build better models:

That said, it is pleasing to think that Hayek himself may yet turn out to have been a very great economist after all, far more significant than Myrdal or Robinson, when seen against the background of a broader canvas. The proposition that markets are fundamentally evolutionary mechanisms runs through Hayek’s work. Caldwell, of Duke University, notes that, starting with the Constitution of Liberty, “the twin ideas of evolution and spontaneous order” become prominent, especially the idea of cultural evolution, with its emphasis on rules, norms, and decentralization.

These are today lively concepts in laboratories and universities around the world. “It could have been that Hayek was running a different race, and the fact that he didn’t do well in the Walrasian race was that he wasn’t running in it—he was running in the complexity race,” says David Colander, of Middlebury College. Hayek may yet enter history as a prophet of evolutionary economics, a discipline dreamt of since the days of Thorstein Veblen and Alfred Marshall in the late nineteenth century but not yet forged, whose great days lie ahead.

UPDATE: See also Pete Boettke on this same theme, motivated by Alex’s post.

“Death of a currency”

Lynne Kiesling

One of the great topics of discussion with my in-laws over the holidays was the impending demise of the euro, and whether there was any hope for, or reason to maintain, the euro given the sovereign fiscal challenges of the member countries. The disastrous German and Italian bond auctions, and Spain’s cancellation of its sovereign bond auction, seem to portend “eurogeddon”. One of the articles that helped me interpret these events was this column from Jeremy Warner in the Telegraph:

No, what this is about is the markets starting to bet on what was previously a minority view – a complete collapse, or break-up, of the euro. Up until the past few days, it has remained just about possible to go along with the idea that ultimately Germany would bow to pressure and do whatever might be required to save the single currency.

The prevailing view was that the German Chancellor didn’t really mean what she was saying, or was only saying it to placate German voters. When finally she came to peer over the precipice, she would retreat from her hard line position and compromise. Self interest alone would force Germany to act.

But there comes a point in every crisis where the consensus suddenly shatters. That’s what has just occurred, and with good reason. In recent days, it has become plain as a pike staff that the lady’s not for turning.

In addition to the striking parallel images of Merkel and Thatcher as women heads of government fighting (almost too late) for fiscal responsibility, Warner’s column does a good job of pointing to the kind of market and policy movements we can expect in the next couple of weeks. Clearly many of the parties behaving responsibly have already laid out contingency plans to mitigate the effects.

But I have a simple-minded question to ask, perhaps one that I should have asked two years ago: why are so many people so worried about contagion from sovereign default in the eurozone? Should they be worried?

Typically, interconnected financial markets have negative feedback loops that dampen propagation; price changes, as investors move money around in response to changes in relative risk, are an example of such negative feedback. But so many policies have been designed to insulate, protect, and bail out parties, and by insuring against losses those policies introduce asymmetries. Have the negative feedback loops been distorted, and replaced or outweighed by positive feedback loops that amplify effects? That’s how I’ve been thinking about the bailouts, subsidies, and loan guarantees in both the EU and the US: policies that distort the equilibrating negative feedback effects and introduce asymmetries that create destructive positive feedback effects, where disequilibrating shocks that would otherwise have been smaller and dampened by markets’ normal negative feedback are instead amplified. So I would normally say that the forces of self-organization exist to buffer and counter the forces of contagion, but the political rules in operation have stifled those forces and exacerbated contagion.
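
A stylized way to see the difference (illustrative numbers only, not a model of any actual market): treat a shock as a quantity that each period is either partly absorbed by the system (negative feedback) or partly reinforced (positive feedback).

```python
def propagate(shock, feedback, steps=20):
    """Iterate x_{t+1} = x_t * (1 + feedback).
    feedback < 0: negative feedback, the shock decays toward zero.
    feedback > 0: positive feedback, the shock compounds each period."""
    x = shock
    path = [x]
    for _ in range(steps):
        x *= 1 + feedback
        path.append(x)
    return path

# Hypothetical parameters: prices absorb 30% of the shock each period
# vs. guarantees and bailouts reinforcing it by 30% each period.
damped = propagate(1.0, -0.3)
amplified = propagate(1.0, +0.3)
print(f"after 20 steps: damped={damped[-1]:.4f}, amplified={amplified[-1]:.1f}")
```

The same initial shock all but vanishes under negative feedback and grows nearly two-hundredfold under positive feedback, which is the sense in which policies that flip the sign of the feedback can turn a containable disturbance into contagion.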

One of those forces of self-organization and negative feedback is bankruptcy and default, both private and sovereign. I wonder if the EU will be able to activate the salutary re-equilibrating benefits of bankruptcy and default while simultaneously being able to either stem contagion or have the political fortitude to carry on through the pain and cost that is larger than it might have been otherwise.