Why we still need a CERN for climate change – Physics World

Tim Palmer says that we must pool our resources to produce high-resolution climate models that societies can use, before it is too late

[Image: a flood-destroyed street in Morocco] Worrying trend: Reliable climate models are needed so that societies can adapt to the impact of climate change. (Courtesy: Shutterstock/Migel)

It was a scorcher last year. Land and sea temperatures were up to 0.2 °C higher than expected every single month in the second half of 2023, with these warm anomalies continuing into 2024. We know the world is warming, but the sudden heat spike had not been predicted. As NASA climate scientist Gavin Schmidt wrote in Nature recently: “It’s humbling and a bit worrying to admit that no year has confounded climate scientists’ predictive capabilities more than 2023 has.”

As Schmidt went on to explain, a spell of record-breaking warmth had been deemed “unlikely” despite 2023 being an El Niño year, where the relatively cool waters in the central and eastern equatorial Pacific Ocean are replaced with warmer waters. Trouble is, the complex interactions between atmospheric deep convection and equatorial modes of ocean variability, which lie behind El Niño, are poorly resolved in conventional climate models.

Our inability to simulate El Niño properly with current climate models (J. Climate 10.1175/JCLI-D-21-0648.1) is symptomatic of a much bigger problem. In 2011 I argued that contemporary climate models were not good enough to simulate the changing nature of weather extremes such as droughts, heat waves and floods (see “A CERN for climate change” March 2011 p13). With grid-point spacings typically around 100 km, these models provide a blurred, distorted vision of the future climate. For variables like rainfall, the systematic errors associated with such low spatial resolution are larger than the climate-change signals that the models attempt to predict.
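To get a feel for why kilometre-scale resolution is such a computational leap, here is a back-of-envelope sketch in Python (my illustration, not a calculation from the article). It assumes a uniform global grid and a CFL-limited timestep, so the numbers are rough scalings only.

```python
# Back-of-envelope sketch (not from the article): why pushing a global model
# from ~100 km to ~1 km grid spacing is so expensive.  Assumes a uniform grid
# and a CFL-limited timestep; real models differ in detail, but the scaling
# argument is the standard one.

EARTH_SURFACE_KM2 = 5.1e8   # approximate surface area of the Earth

def grid_columns(spacing_km: float) -> float:
    """Rough number of horizontal grid columns at a given spacing."""
    return EARTH_SURFACE_KM2 / spacing_km ** 2

def relative_cost(coarse_km: float, fine_km: float) -> float:
    """Refining the horizontal grid by a factor r gives ~r^2 more columns
    and, via the CFL timestep limit, ~r more timesteps: cost grows like r^3."""
    r = coarse_km / fine_km
    return r ** 3

print(f"columns at 100 km spacing: {grid_columns(100):.1e}")        # ~5.1e4
print(f"columns at   1 km spacing: {grid_columns(1):.1e}")          # ~5.1e8
print(f"cost factor, 100 km -> 1 km: {relative_cost(100, 1):.0e}")  # ~1e6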

Reliable climate models are vital if societies are to adapt to climate change, assess the urgency of reaching net-zero and, if things get really bad, implement geoengineering solutions. Yet how can we adapt if we don’t know whether droughts, heat waves, storms or floods pose the greater threat? How can we assess the urgency of net-zero if models cannot simulate “tipping” points? How can we agree on potential geoengineering solutions if we cannot reliably assess whether spraying aerosols into the stratosphere will weaken the monsoons or cut the moisture supply to the tropical rainforests? Climate modellers have to take the issue of model inadequacy much more seriously if they wish to provide society with reliable, actionable information about climate change.

I concluded in 2011 that we needed to develop global climate models with a spatial resolution of around 1 km (with compatible temporal resolution), and that the only way to achieve this was to pool human and computer resources to create one or more internationally federated institutes. In other words, we need a “CERN for climate change” – an effort inspired by the particle-physics facility near Geneva, which has become an emblem of international collaboration and progress.

That was 13 years ago and since then nature has spoken with a vengeance. We have seen unprecedented heat waves, storms and floods, so much so that the World Economic Forum rated “extreme weather” as the most likely global event to trigger an economic crisis in the coming years. As prominent climate scientist Michael Mann noted in 2021 following a devastating flood in Northern Europe: “The climate-change signal is emerging from the noise faster than the models predicted.” That view was backed by a briefing note from the Royal Society for the COP26 climate-change meeting held in Glasgow in 2021, which stated that the inability to simulate physical processes in fine detail accounts for “the most significant uncertainties in future climate, especially at the regional and local levels”.

Yet modelling improvements have not kept pace with the changing nature of these real-world extremes. While many national climate-modelling centres have finally started work on high-resolution models, on current trends it will take until the second half of the century to reach kilometre-scale resolution. That will be too late to be useful in tackling climate change (see figure below), and urgency is needed more than ever.
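The “current trends” extrapolation is easy to reproduce in spirit. The sketch below is mine, with an assumed 2020 starting resolution of 100 km and an assumed halving time of ten years (neither figure is taken from the article); it simply shows why continuing past progress pushes kilometre scale well into the second half of the century.

```python
# Illustrative only: the kind of trend extrapolation behind the figure below.
# Starting resolution and halving time are assumptions for illustration.

import math

def year_resolution_reached(target_km: float,
                            start_km: float = 100.0,
                            start_year: int = 2020,
                            halving_time_years: float = 10.0) -> float:
    """Year at which grid spacing reaches target_km if it halves every
    halving_time_years, starting from start_km in start_year."""
    halvings = math.log2(start_km / target_km)
    return start_year + halvings * halving_time_years

print(f"~10 km reached around {year_resolution_reached(10):.0f}")  # ~2053
print(f" ~1 km reached around {year_resolution_reached(1):.0f}")   # ~2086
```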

A climate EVE

Pooling human and computing resources internationally is a solution that seems obvious. In a review of UK science in 2023, the Nobel-prize winner Paul Nurse commented that “there are research areas of global strategic importance where new multi-nationally funded institutes or international research infrastructures could be contemplated, an obvious example being an institute of climate change built on the EMBL [European Molecular Biology Laboratory] model”. He added that “such institutes are powerful tools for multinational collaboration and bring great benefit not only internationally but also for the host nation”.

So, why hasn’t it happened? Some say that we don’t need more science and should instead spend the money helping those who are already suffering from climate change. That is true, but computer models have helped vulnerable societies enormously over the years. Before the 1980s, poorly forecast tropical cyclones could kill hundreds of thousands of people in vulnerable regions. Now, with improved model resolution, excellent week-ahead predictions (and the ability to communicate the forecasts) can be made, and it is rare for such storms to kill more than a few tens of people.

[Figure: graph of the spatial resolution of global climate models decreasing over time] Too little, too late: Based on current trends, global climate models used in Intergovernmental Panel on Climate Change assessment reports will only have a resolution of a few kilometres by 2055. (Redrawn from original by Andreas Prein, National Center for Atmospheric Research)

High-resolution climate models will help target billions of investment dollars to allow vulnerable societies to become resilient to regionally specific types of future extreme weather. Without this information, governments could squander vast amounts of money on maladaptation. Indeed, scientists from the global south already complain that they don’t have actionable information from contemporary models to make informed decisions.

Others say that different models are necessary so that when they all agree, we can be confident in their predictions. However, the current generation of climate models is not diverse at all. They all assume that critically important sub-grid climatic processes like deep convection, flow over orography and ocean mixing by mesoscale eddies can be parametrized by simple formulae. This assumption is false and is the origin of common systematic errors in contemporary models. It is better to represent model uncertainty with more scientifically sound methodologies.
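To make “parametrized by simple formulae” concrete, the toy sketch below (entirely invented, not a scheme from any real model) reduces a sub-grid process to a single bulk formula, and then scales it by a random factor – one widely used way, often called stochastic parametrization, of representing the uncertainty such schemes introduce. The article itself does not prescribe a specific methodology.

```python
# Toy illustration (invented): a sub-grid process reduced to one bulk formula,
# and the same formula with a multiplicative random perturbation to
# acknowledge that the "true" sub-grid effect is uncertain.

import random

def convective_tendency(instability: float, rate: float = 0.1) -> float:
    """Deterministic toy parametrization: relax instability at a fixed rate."""
    return -rate * max(instability, 0.0)

def stochastic_tendency(instability: float, rate: float = 0.1,
                        spread: float = 0.5) -> float:
    """Same formula, with the tendency scaled by a random factor."""
    factor = 1.0 + random.uniform(-spread, spread)
    return factor * convective_tendency(instability, rate)

random.seed(0)
print(convective_tendency(2.0))                                    # always -0.2
print([round(stochastic_tendency(2.0), 3) for _ in range(3)])      # differs per call
```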

A shift, however, could be on the horizon. Last year a climate-modelling summit was held in Berlin to kick-start the international project Earth Visualisation Engines (EVE). It aims not only to create high-resolution models but also to bring scientists from the global north and south together to produce accurate, reliable and actionable climate information.

Like the EMBL, EVE is planned as a series of highly interconnected nodes, each with dedicated exascale computing capability, serving all of global society. The funding for each node – about $300m per year – is small compared with the trillions of dollars of loss and damage that climate change will cause.

Hopefully, in another 13 years’ time EVE or something similar will be producing the reliable climate predictions that societies around the globe now desperately need. If not, then I fear it will be too late.
