Predicting the future is not easy: so much is unknown or uncertain that there is more chance of getting it wrong than of getting it right. Nevertheless, we have an unquenchable thirst for forecasting the future, as any astrologer will confirm. Those who correctly predict an outcome or event are praised for their foresight, yet is this mostly luck, or is there skill involved? Why are some forecasters seemingly more accurate than others? The Good Judgement Project (GJP) was a four-year, multi-million-dollar research programme to discover how the accuracy of prediction can be improved.
The findings of the research, and the attributes of ‘Superforecasters’, are the basis of a 2015 book by Philip Tetlock and Dan Gardner with the modest title ‘Superforecasting: The Art and Science of Prediction’. The project recruited 2,800 volunteers and, on a rolling one-year timeframe, asked them to predict specific outcomes with short horizons of three to six months, so that accuracy could be measured once events occurred. Respondents were not experts but highly numerate individuals who understood probabilities, down to the subtle difference between a 68% and a 72% likelihood.
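How is forecast accuracy actually measured once events occur? A standard yardstick for probabilistic forecasts, and the one the GJP used, is the Brier score: the mean squared gap between the probability given and the 0-or-1 outcome, where lower is better. The sketch below is illustrative only; the forecasters and numbers are hypothetical, not project data.

```python
# Minimal sketch of Brier scoring, assuming 0/1 outcomes.
# Lower scores are better; 0.25 is the "always say 50:50" baseline.

def brier_score(forecasts, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Two hypothetical forecasters judging the same five events
# (1 = the event happened, 0 = it did not):
outcomes = [1, 0, 1, 1, 0]
careful  = [0.72, 0.20, 0.85, 0.68, 0.30]   # discriminating forecasts
hedger   = [0.50, 0.50, 0.50, 0.50, 0.50]   # never commits either way

print(brier_score(careful, outcomes))  # about 0.067: rewarded for commitment
print(brier_score(hedger, outcomes))   # 0.25: the coin-flip baseline
```

Note how the score rewards exactly the granularity the project demanded: saying 72% rather than 68% matters, because every percentage point of miscalibration is penalised.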
Respondents were given time to update their forecasts as new information became available, and those who scored highest for accuracy in year one were grouped together in year two so that their combined predictions achieved even greater accuracy. The project found that intelligent and well-read amateurs could predict more accurately than professional experts, but the secret was not who they were but what they did. Tetlock identified their modus operandi as the key to their success: ‘the hard work of research, the careful thought and self-criticism, the gathering and synthesising of other perspectives, the granular judgements and relentless updating’.
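The gain from grouping forecasters can be sketched in a few lines. A common way to combine probability estimates, and one the GJP researchers explored, is to average them and then ‘extremize’ the result, pushing it away from 50:50, because a simple average of several partly independent forecasts tends to be too timid. The team and the extremizing exponent below are hypothetical choices for illustration.

```python
# Illustrative sketch of pooling forecasts, assuming a hypothetical
# four-person team and an extremizing exponent of 2.5.

def pool(probabilities, extremize=2.5):
    """Average the forecasts, then push the result away from 0.5
    by raising the odds to a power (a standard extremizing trick)."""
    mean_p = sum(probabilities) / len(probabilities)
    odds = (mean_p / (1 - mean_p)) ** extremize
    return odds / (1 + odds)

team = [0.65, 0.70, 0.72, 0.60]       # four forecasters, one question
print(round(pool(team), 2))            # about 0.85, bolder than the 0.67 mean
```

The intuition: if four people with different information sources all lean the same way, the evidence is jointly stronger than any single 65–72% estimate suggests, so the pooled forecast should be more confident than the raw average.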
The findings suggest that accurate prediction does require an element of skill: although luck will always play a part, accuracy demands attention to detail plus a robust approach to uncertainty. At the core of this skillset lies an understanding of probability deep enough to distinguish reducible from irreducible uncertainty. The latter is aleatoric uncertainty, which is unknowable, as distinct from epistemic uncertainty, which is in principle knowable. The challenge for prediction into an unknowable future is to reduce uncertainty wherever possible and to recognise which parts cannot be reduced.
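The distinction can be made concrete with a toy simulation. Think of a biased coin whose true bias we don't know: not knowing the bias is epistemic uncertainty, and it shrinks as we gather evidence; the randomness of the next flip is aleatoric, and no amount of data removes it. The numbers below are purely illustrative.

```python
# Sketch of reducible vs irreducible uncertainty, using a coin
# with a bias of 0.7 that the forecaster does not know in advance.
import random

random.seed(42)
TRUE_BIAS = 0.7  # hidden from the forecaster

for n in (10, 1_000, 100_000):
    flips = [random.random() < TRUE_BIAS for _ in range(n)]
    estimate = sum(flips) / n
    # The estimate converges on 0.7: epistemic uncertainty is reduced
    # by research. But the NEXT flip remains a 70:30 gamble however
    # much data we collect: that residue is aleatoric, irreducible.
    print(n, round(estimate, 3))
```

A superforecaster, on this view, works the epistemic part hard (research, updating) while stating the aleatoric part honestly as a probability rather than a false certainty.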
Psychologists have shown that the human mind craves certainty, and when it doesn’t find it, the mind imposes or creates it. This is why shareholders like a confident management team: they conflate confidence with competence. The media seek pundits to offer certainties to the public as reassurance, yet in many cases this certainty is manufactured for effect; one only has to look at how public broadcasters have handled news of the Covid-19 outbreak to see this. Certainty is what people want; uncertainty might be more honest, but it doesn’t satisfy an audience in the same way.
Tetlock set out to prove that forecasting could be made more accurate through measurement and refinement, but he used short timeframes. Accuracy deteriorates with time: the further into the future one forecasts, the less accurate the forecast will be. His project ran over four years, but issues were selected for their measurability, so they tended to be geopolitical or socio-economic outcomes that would transpire in the coming months. Tetlock did not address the low-frequency yet high-impact events that, thanks to Nassim Taleb, we now call ‘Black Swans’.
The Good Judgement Project achieved its purpose of harnessing the brainpower of 2,800 individuals to produce forecasts that proved more accurate than those of comparable professional forecasters. It was their amateur dedication to diligent research and situation analysis, carried out with an open, sharing mindset, that triumphed. The notion that gifted amateurs can succeed where established professionals have failed appeals to advisors close to our current Prime Minister; it explains why special advisors (SPADs) are often given credence over and above civil servants.
Forecasting fascinates politicians, and it is no surprise that the US government sponsored the GJP, run jointly by the University of California, Berkeley and the University of Pennsylvania, through its agency IARPA (the Intelligence Advanced Research Projects Activity). If forecasting can be shown to be more of a science than an art, such sponsorship should hardly be a surprise. Politicians want to deliver good news (bread and circuses), so anything that helps them look into the future is welcome. Good foresight is worth paying for if it gives you an edge.
Written by Garry Honey, SAMI Associate and founder of Chiron Risk
The views expressed are those of the author(s) and not necessarily of SAMI Consulting.
SAMI Consulting was founded in 1989 by Shell and St Andrews University. It has undertaken scenario planning projects for a wide range of UK and international organisations. Its core skill is providing the link between futures research and strategy.
If you enjoyed this blog from SAMI Consulting, the home of scenario planning, please sign up for our monthly newsletter at firstname.lastname@example.org and/or browse our website at http://www.samiconsulting.co.uk