Foxes, Hedgehogs, and Forecasting  

Posted by T. Greer

Today I ran across Louis Menand's review of Philip Tetlock's book Expert Political Judgment: How Good Is It? How Can We Know? The bulk of Dr. Tetlock's book concerns an 18-year experiment designed to test the forecasting skills of pundits and analysts. Having gathered some 284 experts in international politics, he asked them to forecast hundreds of future events. As time passed he studied the aggregate accuracy of the roughly 28,000 predictions given. Readers familiar with Nassim Nicholas Taleb's The Black Swan and like books will not be surprised to hear that these experts did quite dismally. Most forecasters did no better than basic predictive algorithms; many were worse than chance. This held true regardless of a forecaster's political persuasion or level of expertise. Indeed, the more a forecaster knew about a subject, the less reliable his predictions were. Dr. Tetlock's explanation for this is intriguing. To quote from the review:

Everybody's An Expert
Louis Menand. The New Yorker. 5 December 2005.
It was no news to Tetlock, therefore, that experts got beaten by formulas. But he does believe that he discovered something about why some people make better forecasters than other people. It has to do not with what the experts believe but with the way they think. Tetlock uses Isaiah Berlin’s metaphor from Archilochus, from his essay on Tolstoy, “The Hedgehog and the Fox,” to illustrate the difference. He says:

Low scorers look like hedgehogs: thinkers who “know one big thing,” aggressively extend the explanatory reach of that one big thing into new domains, display bristly impatience with those who “do not get it,” and express considerable confidence that they are already pretty proficient forecasters, at least in the long term. High scorers look like foxes: thinkers who know many small things (tricks of their trade), are skeptical of grand schemes, see explanation and prediction not as deductive exercises but rather as exercises in flexible “ad hocery” that require stitching together diverse sources of information, and are rather diffident about their own forecasting prowess.

A hedgehog is a person who sees international affairs to be ultimately determined by a single bottom-line force: balance-of-power considerations, or the clash of civilizations, or globalization and the spread of free markets. A hedgehog is the kind of person who holds a great-man theory of history, according to which the Cold War does not end if there is no Ronald Reagan. Or he or she might adhere to the “actor-dispensability thesis,” according to which Soviet Communism was doomed no matter what. Whatever it is, the big idea, and that idea alone, dictates the probable outcome of events. For the hedgehog, therefore, predictions that fail are only “off on timing,” or are “almost right,” derailed by an unforeseeable accident. There are always little swerves in the short run, but the long run irons them out.

Foxes, on the other hand, don’t see a single determining explanation in history. They tend, Tetlock says, “to see the world as a shifting mixture of self-fulfilling and self-negating prophecies: self-fulfilling ones in which success breeds success, and failure, failure but only up to a point, and then self-negating prophecies kick in as people recognize that things have gone too far.”

Tetlock did not find, in his sample, any significant correlation between how experts think and what their politics are. His hedgehogs were liberal as well as conservative, and the same with his foxes. (Hedgehogs were, of course, more likely to be extreme politically, whether rightist or leftist.) He also did not find that his foxes scored higher because they were more cautious—that their appreciation of complexity made them less likely to offer firm predictions. Unlike hedgehogs, who actually performed worse in areas in which they specialized, foxes enjoyed a modest benefit from expertise. Hedgehogs routinely over-predicted: twenty per cent of the outcomes that hedgehogs claimed were impossible or nearly impossible came to pass, versus ten per cent for the foxes. More than thirty per cent of the outcomes that hedgehogs thought were sure or near-sure did not, against twenty per cent for foxes.
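The calibration figures above (outcomes called "impossible" that occurred, "sure things" that did not) can be made concrete with a probability score. The sketch below is my own illustration, not Tetlock's actual method or data: it uses the standard Brier score, the mean squared difference between a stated probability and what actually happened, with invented numbers for a hypothetical hedgehog and fox.

```python
# Illustrative sketch only: Brier scoring of two hypothetical forecasters.
# Tetlock used probability scoring of this general kind, but the forecasters
# and numbers here are invented for demonstration.

def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and binary outcomes.
    0 is a perfect score; confident misses are penalized most heavily."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Four events: 1 = the event occurred, 0 = it did not.
outcomes = [1, 0, 0, 1]

# The "hedgehog" calls everything near-certain one way or the other;
# the "fox" offers hedged, middling probabilities.
hedgehog = [0.05, 0.95, 0.05, 0.95]
fox      = [0.60, 0.40, 0.30, 0.70]

print(brier_score(hedgehog, outcomes))  # 0.4525 -- punished for confident misses
print(brier_score(fox, outcomes))       # 0.125  -- better despite no bold calls
```

The point of the exercise: a forecaster who never commits to extreme probabilities can still decisively outscore one who does, which is why the foxes' "diffidence" in the excerpt above is not the same thing as mere caution.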

The implications of Dr. Tetlock's study are worth pondering.

Human beings naturally seek out one or two "hedgehog" ideas to incorporate into their narratives of the world. This is but one of many destructive inclinations found in the untrained mind. Luckily, there is no reason any mind must remain untrained. Many schools and educators stress the need to develop "critical thinking skills"; helping students develop the mindset of a fox should be a central part of that effort. Seen this way, there is a serious flaw in the structure of modern post-secondary education, which not only allows but promotes academic specialization to the point where hedgehog attitudes become automatic. A wide-ranging, interdisciplinary curriculum, in contrast, can expose students to enough contrasting viewpoints and approaches that they readily see the folly of one-size-fits-all theories.

This is not to say that educational reform can completely eliminate the hedgehog pattern of thought. Certain academic disciplines promote grand-theory thinking as a matter of course. Social scientists are perhaps the easiest to castigate on this count, for much of their work demands that reality be reduced to simplified and (supposedly!) predictive models or paradigms. As long as we expect our experts to be experts in fields that value theory more than contingency, hedgehog thinking will continue to play an important part in our assessments of the future.

Dr. Tetlock's presentation for the Long Now Foundation.

This entry was posted on 22 October, 2010 at 9:57 PM.
