Filmed on Monday November 23, 02015
The pundits we all listen to are no better at predictions than a “dart-throwing chimp,” and they are routinely surpassed by normal news-attentive citizens. So Philip Tetlock reported in his 02005 book, Expert Political Judgment—and in a January 02007 SALT talk.
It now turns out there are some people who are spectacularly good at forecasting, and their skills can be learned. Tetlock discovered them in the course of building winning teams for a tournament of geopolitical forecasting run by IARPA—Intelligence Advanced Research Projects Activity. His brilliant new book, SUPERFORECASTING: The Art and Science of Prediction, spells out the methodology the superforecasters developed. Like Daniel Kahneman’s THINKING, FAST AND SLOW, the book changes how we think about thinking.
Philip Tetlock is a professor at the University of Pennsylvania. With his co-researcher (and wife) Barbara Mellers he is running the Good Judgment Project, with its open competition for aspiring forecasters.
Will Syria’s President Assad still be in power at the end of next year? Will Russia and China hold joint naval exercises in the Mediterranean in the next six months? Will the Oil Volatility Index fall below 25 in 2016? Will the Arctic sea ice mass be lower next summer than it was last summer?
Five hundred such questions of geopolitical import were posed in tournament mode to thousands of amateur forecasters by IARPA, the Intelligence Advanced Research Projects Activity, between 2011 and 2015. (Tetlock mentioned that senior US intelligence officials opposed the project, but younger-generation staff were able to push it through.) Scores were kept meticulously, and before long the most adept amateur “superforecasters” were doing 30 percent better than professional intelligence officers with access to classified information. They were also better than prediction markets and drastically better than famous pundits and politicians, whom Tetlock described as engaging in a deliberately vague “ideological kabuki dance.”
What made the amateurs so powerful was Tetlock’s insistence that they score geopolitical predictions the way meteorologists score weather predictions, and then learn how to improve their scores accordingly. Meteorologists predict in percentages—“there is a 70 percent chance of rain on Thursday.” It takes time and statistics to find out how good a particular meteorologist is. If it in fact rained on 7 out of 10 such days, the meteorologist gets a high score for calibration (the percentages matched reality) and for resolution (the forecasts were decisive rather than hedged near 50 percent). Superforecasters, remarkably, assigned probability estimates of 72-76 percent to things that happened and 24-28 percent to things that didn’t.
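The scoring Tetlock describes is conventionally done with the Brier score (mean squared error between stated probabilities and actual outcomes), plus a calibration check per probability bucket. A minimal sketch, on made-up data (the forecasts and outcomes here are illustrative, not from the tournament):

```python
# Ten forecasts of "70 percent chance of rain," and what actually happened
# (1 = it rained, 0 = it didn't). Illustrative data, not tournament data.
forecasts = [0.7, 0.7, 0.7, 0.7, 0.7, 0.7, 0.7, 0.7, 0.7, 0.7]
outcomes  = [1, 1, 1, 0, 1, 1, 0, 1, 0, 1]   # rained on 7 of 10 days

# Brier score: mean squared error between forecast and outcome. 0 is perfect,
# 0.25 is what a permanent "50 percent" hedge earns, lower is better.
brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Calibration for the 70-percent bucket: did roughly 70 percent of the
# events given a 0.7 forecast actually happen?
hit_rate = sum(outcomes) / len(outcomes)

print(round(brier, 2), hit_rate)  # -> 0.21 0.7
```

A forecaster who said 0.9 on the same days would have scored worse (overconfident); one who said 0.5 every time would have scored 0.25 (timid). That is the sense in which the tournament rewards being “justifiably decisive.”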
How did they do that? They learned, Tetlock said, to avoid falling for the “gambler’s fallacy”—detecting nonexistent patterns. They learned objectivity—the aggressive open-mindedness it takes to set aside personal theories of public events. They learned to not overcompensate for previous mistakes—the way American intelligence professionals overcompensated for the false negative of 9/11 with the false positive of mass weapons in Saddam’s Iraq. They learned to analyze from the outside in—Assad is a dictator; most dictators stay in office a very long time; consider any current news out of Syria in that light. And they learned to balance between over-adjustment to new evidence (“This changes everything”) and under-adjustment (“This is just a blip”), and between overconfidence (“100 percent!”) and over-timidity (“Um, 50 percent”). “You only win a forecasting tournament,” Tetlock said, “by being decisive—justifiably decisive.”
Much of the best forecasting came from teams that learned to collaborate adroitly. Diversity on the teams helped. One important trick was to give extra weight to the best individual forecasters. Another was to “extremize” to compensate for the conservatism of aggregate forecasts—if everyone says the chances are around 66 percent, then the real chances are probably higher.
In the Q & A following his talk Tetlock was asked if the US intelligence community would incorporate the lessons of its forecasting tournament. He said he is cautiously optimistic. Pressed for a number, he declared, “Ten years from now I would offer the probability of .7 that there will be ten times more numerical probability estimates in national intelligence estimates than there were in 2005.”
Asked about long-term forecasting, he replied, “Here’s my long-term prediction for Long Now. When the Long Now audience of 2515 looks back on the audience of 2015, their level of contempt for how we go about judging political debate will be roughly comparable to the level of contempt we have for the 1692 Salem witch trials.” —Stewart Brand
We would also like to recognize George Cowan (01920 - 02012) for being the first to sponsor this series.
Seminars About Long-term Thinking is made possible through the generous support of The Long Now Membership and our Seminar Sponsors. We offer $5,000 and $15,000 annual Sponsorships, both of which entitle the sponsor and a guest to reserved seating at all Long Now seminars and special events. In addition, we invite $15,000 Sponsors to attend dinner with the speaker after each Seminar, and $5,000 Sponsors may choose to attend any four dinners during the sponsored year. For more information about donations and Seminar Sponsorship, please contact firstname.lastname@example.org. We are a public 501(c)(3) non-profit, and donations to us are always tax deductible.
The Long Now Foundation • Fostering Long-term Responsibility • est. 01996