

Philip E. Tetlock, Dan Gardner

Superforecasting: The Art and Science of Prediction

Nonfiction | Book | Adult | Published in 2015


Summary and Study Guide

Overview

Superforecasting: The Art and Science of Prediction (2015) was co-authored by Good Judgment Project (GJP) co-leader Philip Tetlock and psychology and decision-making writer Dan Gardner. It builds on Tetlock’s 2005 book Expert Political Judgment: How Good Is It? How Can We Know?, which analyzed predictions made by expert political scientists and concluded that those predictions were highly unreliable.

Superforecasting draws upon the work of the GJP, a multiyear forecasting study Tetlock co-launched in 2011 with Barbara Mellers. The GJP was funded by the Intelligence Advanced Research Projects Activity (IARPA), part of the American intelligence community. The GJP recruited 2,800 volunteers and asked them to estimate the probability that highly specific world events would occur. Working both as individuals and in teams, the volunteers used data, critical thinking, and methods of aggregation to reach their predictions. Tetlock discovered that some volunteers vastly outperformed their peers despite having no special expertise in the subjects at hand. He dubbed these gifted individuals “superforecasters” and set out to discover what distinguished their approaches to prediction. The book explores hedgehoglike versus foxlike thinking, whether forecasting is an art or a science, and the role that doubt can play in prediction.

New York Times reviewer Leonard Mlodinow asserts that the lessons gleaned about what distinguishes superforecasters “are hardly surprising, though the accuracy that ordinary people regularly attained through their meticulous application did amaze me” (Mlodinow, Leonard. “‘Mindware’ and ‘Superforecasting.’” The New York Times, 15 Oct. 2015). Mlodinow adds that superforecasters’ approaches offer a model for how to “understand and react more intelligently to the confusing world around us.”

This guide uses the 2015 Cornerstone Digital Kindle edition.

Summary

Superforecasting: The Art and Science of Prediction begins by defining what a forecast is and showing that everyone is a forecaster in some capacity, since our predictions about different facets of our day influence our decisions. However, the book asserts that so-called experts’ predictions are, on average, about as accurate as a dart-throwing chimpanzee’s aim (i.e., no better than chance).

Still, since launching the Good Judgment Project in 2011, Tetlock has encountered “superforecasters” who stand out for their ability to predict outcomes successfully. Chapter by chapter, the book assesses the character traits and approaches that distinguish superforecasters. For example, superforecasters fit the ancient Greek poet Archilochus’s description of the fox, which knows many little things, rather than the hedgehog, which knows only one big thing. While the media favors hedgehog types because they are opinionated and controversial, hedgehogs are often poorer forecasters than foxes, who are more open to learning and to new evidence that contradicts their preconceived notions.

The authors investigate whether superforecasters are advantaged by having above-average intelligence, numerical ability, and interest in news and world events. In each case, the answer is yes, but only moderately. Throughout their analysis, the authors make the case for using numerical data both to quantify the likelihood of an event and to assess the performance of any particular forecaster. The authors believe such quantification and measurement would avoid the vagueness that has kept forecasting mediocre for so long.
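To make this concrete, the accuracy measure Tetlock describes is the Brier score, which gauges the distance between probabilistic forecasts and actual outcomes: 0 is a perfect score, 2 is maximally wrong, and always guessing 50% yields 0.5. The snippet below is only an illustrative sketch of that calculation, not the Good Judgment Project’s actual scoring code.

```python
# Illustrative sketch of the Brier score described in the book
# (not the GJP's actual scoring code). For each yes/no question, the
# forecaster states a probability p that the event will happen; the
# outcome is 1 if it happened and 0 if it did not.
def brier_score(forecasts, outcomes):
    """Average two-sided Brier score: 0 = perfect, 0.5 = always guessing 50%, 2 = maximally wrong."""
    per_question = [
        (p - o) ** 2 + ((1 - p) - (1 - o)) ** 2
        for p, o in zip(forecasts, outcomes)
    ]
    return sum(per_question) / len(per_question)

# A forecaster who said 80%, 30%, and 90% on three questions, where the
# first and third events occurred and the second did not:
print(round(brier_score([0.8, 0.3, 0.9], [1, 0, 1]), 3))  # 0.093
```

A lower score means the stated probabilities tracked reality more closely, which is what allows the authors to compare forecasters on a common, quantitative footing.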

The authors conclude that the personality trait that most distinguishes superforecasters is a commitment to continual self-improvement and learning from their mistakes. Superforecasters are eager to perform autopsies on their failures and to pinpoint exactly why and how their predictions went awry. This postmortem practice ensures that they adjust their approach. The authors believe another factor in great forecasting is teamwork, as the most successful forecasts often aggregate numerous people’s predictions.

The last chapters address forecasting’s critics and reinforce the importance of a forecaster’s humility and respect for uncertainty, even in areas such as leadership, where decisiveness and confidence are prized. The authors predict that forecasting will improve if public bodies demand evidence-based forecasts, just as they now demand evidence-based medicine. However, the authors acknowledge that some institutions, including political parties, use forecasts not for accuracy but to promote their cause; in such cases, the inaccuracy of current forecasting serves their purpose. The book suggests improving the forecasting process by pairing superforecasters with “superquestioners” and closes with “Ten Commandments for Aspiring Superforecasters.”
