Limits to modelling

September 2023 was the hottest September on record, following the hottest August and the hottest July. It wasn’t just a bit hotter – it was fully 0.5°C hotter, the largest jump in temperature ever seen. It was 1.8°C hotter than pre-industrial levels. In the UK, there were seven consecutive days with temperatures over 30°C for the first time ever.

Scientists were shocked. “September was, in my professional opinion as a climate scientist, absolutely gobsmackingly bananas,” said Zeke Hausfather, at the Berkeley Earth climate data project. Others commented: “struggling to comprehend how a single year can jump so much”; “unprecedented”; “extraordinary”.

But hundreds of scientists have been working very hard for years to forecast global warming, so what happened? The IPCC does seriously in-depth modelling, with lots of probability assessments, so why did it not anticipate this increase? Its central estimate of global surface air temperature (GSAT) crossing the 1.5°C threshold lies in the early 2030s. By 2030, GSAT in any individual year could exceed 1.5°C relative to 1850–1900 with a likelihood of between 40% and 60%.

The IPCC recognises that modelling is not an exact science – which is why it runs lots of probability models – and that forecasts for any individual month may easily be wide of the mark. There were some factors in September that pushed the temperature higher. We are in a phase of the sporadic El Niño climate pattern, in which heat is released from the oceans. There is an uptick in the 11-year solar cycle; a volcanic eruption in Tonga released a large amount of water vapour, which traps heat; and, perversely, cuts in sun-blocking sulphur emissions from shipping and industry don’t help either. Nonetheless, scientists were still surprised at the scale of the increase.

There are two fundamental challenges with climate modelling – the system and the data. Fully modelling all the interactions within the climate system is virtually impossible, and all models have to rely on assumptions. The system itself is unstable and very sensitive to initial conditions. It is also virtually impossible to get accurate enough data, from all around the world, both about the current state of the system and about the geographic factors (eg ocean currents) affecting its dynamics.

In addition, there are multiple feedback loops, both positive and negative. For example, the melting of polar ice caps means the earth is darker and so absorbs more of the sun’s heat, driving further melting. Scientists have long fretted over climate tipping points, where non-linear dynamics lead to runaway global warming.
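The self-reinforcing character of the ice-albedo loop can be sketched in a few lines of code. This is a purely illustrative toy, not calibrated physics: the melt rate, albedo values and heat response below are all invented numbers chosen only to show the shape of a positive feedback, where each year's warming accelerates the next.

```python
# Toy sketch of the ice-albedo positive feedback: warming melts ice,
# less ice means a darker (lower-albedo) surface, a darker surface
# absorbs more heat, and that extra heat melts more ice.
# All parameters are illustrative assumptions, not real climate values.

def step(temp_anomaly, ice_fraction, melt_rate=0.05, heat_per_albedo=0.5):
    # Warmer years melt a little more ice (floor at zero ice).
    ice_fraction = max(0.0, ice_fraction - melt_rate * temp_anomaly)
    # Less ice -> lower albedo (surface reflects less sunlight).
    albedo = 0.3 + 0.4 * ice_fraction
    # Lower albedo -> more absorbed heat -> further warming.
    temp_anomaly += heat_per_albedo * (0.7 - albedo)
    return temp_anomaly, ice_fraction

t, ice = 0.5, 0.8   # start with modest warming and most ice intact
for year in range(10):
    t, ice = step(t, ice)
    print(f"year {year + 1:2d}: anomaly {t:.2f}, ice fraction {ice:.2f}")
```

Running the loop shows the anomaly rising by a larger amount each year as the ice fraction shrinks – the signature of a positive feedback, and a hint of why such loops worry modellers.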

The climate has long been regarded as a classic example of chaos theory and the butterfly effect. Although the system is assumed to be deterministic, its non-linearity and sensitivity to initial conditions make precise long-term prediction effectively impossible.
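That sensitivity to initial conditions is easy to demonstrate with a textbook chaotic system. The snippet below uses the logistic map – a standard illustration of chaos, not a climate model – with illustrative parameter choices, to show how two starting points differing by one part in ten million soon produce completely different trajectories.

```python
# Demonstration of the butterfly effect using the logistic map
# x -> r * x * (1 - x), a classic chaotic system at r = 4.0.
# The parameter and starting values are illustrative only.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-7)  # tiny perturbation in the start value

# Track how far apart the two trajectories drift at each step.
divergence = [abs(x - y) for x, y in zip(a, b)]
print(f"difference at step 1:  {divergence[1]:.1e}")
print(f"largest difference over 50 steps: {max(divergence):.2f}")
```

The initial gap is microscopic, yet within a few dozen iterations the two runs bear no resemblance to each other – exactly the property that defeats attempts to pin down an individual month's weather, even in a deterministic system.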

So, should we give up on modelling?  And if all we can say is that there is an x% probability of passing a global warming threshold, does that help us at all?

The IPCC models are good enough to give justifiable predictions of overall global warming and to demonstrate that greenhouse gas emissions are clearly a significant cause. This means we can move the debate on to how to reduce them. In that sense the models have served a purpose, and extreme examples like September can add further pressure to act.

But knowing there is a 10% chance of life-threatening change is not especially helpful. We really need to know what to do in such circumstances – to build scenarios for action, and develop resilience to the extremes.

This problem will apply to many other systems if they are sufficiently complex. Certainly any system that includes the actions of people – political, economic and social dynamics, even the pace and direction of technology development – is going to be effectively indeterminate. In nearly all our scenario-building work we treat the rate of climate change, and society’s reaction to it, as critical uncertainties.

One example of that is our work on Sustainability Innovation Pathways. By combining qualitative scenarios with quantitative analysis, the SIP Framework is able to account for the richness of the many possible future net zero worlds while considering alternative approaches and therefore avoiding the danger of single point forecasts.

This ensures that a wide range of futures is considered, whilst providing the quantitative reassurance that companies, investors and governments like to see when determining funding strategy.

Written by Huw Williams, SAMI Principal

The views expressed are those of the author(s) and not necessarily of SAMI Consulting.

Achieve more by understanding what the future may bring. We bring skills developed over thirty years of international and national projects to create actionable, transformative strategy. Futures, foresight and scenario planning to make robust decisions in uncertain times. Find out more at

If you enjoyed this blog from SAMI Consulting, the home of scenario planning, please sign up for our monthly newsletter at and/or browse our website at

Featured image by Marcus Friedrich from Pixabay
