A cloudless future? The Mystery at the Heart of Climate Predictions

We hear a lot about how climate change will alter land, sea and ice. But how will it affect the clouds?

“Low clouds could dry up and shrink like the ice sheets,” says Michael Pritchard, professor of Earth system science at UC Irvine. “Or they could get thicker and more reflective.”

These two scenarios would result in very different future climates. And that, says Pritchard, is part of the problem.

“If you ask two different climate models what the future will be like if we add a lot more CO2, you will get two very different answers. And the main reason for that is the way clouds are included in climate models.”

Shallow clouds formed by fine-scale turbulent eddies, as observed in nature. Researchers are using advanced computing to add higher-resolution cloud dynamics to global simulations.

No one disputes that clouds and aerosols – the bits of soot and dust around which cloud droplets form – are an important part of the climate equation. The problem is that these phenomena occur on length and time scales that today’s models can’t come close to reproducing. They are therefore included in models through a variety of approximations.

Analysis of global climate models consistently shows that clouds are the greatest source of uncertainty and instability.

RETROFITTING COMMUNITY CODES

While the most advanced US global climate model is struggling to approach a global resolution of 4 kilometers, Pritchard estimates that models need a resolution of at least 100 meters to capture the fine-scale turbulent eddies that form shallow cloud systems, a resolution 40 times finer in every direction. By Moore’s Law, it could take until 2060 before the computing power is available to capture this level of detail.
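
To see why the wait could be that long, consider a rough back-of-envelope calculation, sketched below in Python. The fourth-power cost scaling is a common rule of thumb, used here as an assumption rather than a figure from the article:

```python
import math

# A back-of-envelope illustration (an assumption for scale, not a figure
# from the article): refining from 4 km to 100 m is a 40x refinement per
# horizontal direction. With the time step shrinking in proportion and
# commensurate vertical refinement, cost grows roughly as the fourth power.
refinement = 4000 / 100                  # 40x finer in each direction
cost_factor = refinement ** 4            # ~2.6 million times the work

doublings = math.log2(cost_factor)       # ~21 doublings of computing power
years = 2 * doublings                    # Moore's law: one doubling per ~2 years
print(f"{cost_factor:.1e}x the cost, roughly {years:.0f} years away")
```

Roughly 21 doublings at Moore’s-law pace puts the needed computing power several decades out, in the same ballpark as the 2060 estimate.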

Pritchard is working to close this glaring gap by dividing the climate modeling problem into two parts: a lower-resolution (100 km) coarse-grained planetary model and many small patches with a resolution of 100 to 200 meters. The two simulations run independently and then exchange data every 30 minutes to ensure no simulation goes off course or becomes unrealistic.
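
In pseudocode terms, the scheme looks something like the Python sketch below. Everything in it (the toy grid sizes, the placeholder physics, the way patch statistics feed back) is an illustrative stand-in, not actual CESM or E3SM code:

```python
import numpy as np

# Toy sketch of the two-part scheme: a coarse planetary model and many
# small embedded high-resolution patches advance independently, then
# exchange data every 30 simulated minutes.

N_COLUMNS = 64                               # coarse 100-km columns (toy size)
PATCH_CELLS = 32                             # cells in each fine patch

rng = np.random.default_rng(0)
coarse_state = rng.random(N_COLUMNS)         # e.g., column-mean temperature
patches = rng.random((N_COLUMNS, PATCH_CELLS))

def step_coarse(state, dt):
    """Placeholder for the coarse planetary model's dynamics."""
    return state + 0.0001 * dt * (np.roll(state, 1) - state)

def step_patch(patch, forcing, dt):
    """Placeholder for fine-scale turbulence inside one patch."""
    return patch + 0.001 * dt * (forcing - patch)

DT = 60.0                                    # seconds per inner step
EXCHANGE = 30 * 60                           # couple every 30 minutes

for start in range(0, 6 * 3600, EXCHANGE):   # six simulated hours
    # Both levels run independently between exchanges...
    for _ in range(EXCHANGE // int(DT)):
        coarse_state = step_coarse(coarse_state, DT)
        for i in range(N_COLUMNS):
            patches[i] = step_patch(patches[i], coarse_state[i], DT)
    # ...then patch statistics feed back so neither drifts off course.
    coarse_state = 0.5 * (coarse_state + patches.mean(axis=1))
```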

His team reported the results of these efforts in the Journal of Advances in Modeling Earth Systems in April 2022. The research is supported by grants from the National Science Foundation (NSF) and the Department of Energy (DOE).

Dubbed the Multiscale Modeling Framework (MMF), this climate simulation method has been around since 2000 and has long been an option within the Community Earth System Model (CESM) developed at the National Center for Atmospheric Research. The idea has recently enjoyed a renaissance at the Department of Energy, where researchers on the Energy Exascale Earth System Model (E3SM) have pushed it to new computational limits as part of the Exascale Computing Project. Pritchard’s co-author Walter Hannah of Lawrence Livermore National Laboratory helps lead this effort.

“The model does an end run around the hardest problem, modeling the entire planet,” explained Pritchard. “It has thousands of small micromodels that capture things like realistic shallow cloud formations, which only emerge at very high resolution.”

“The Multiscale Modeling Framework approach is also ideal for DOE’s upcoming GPU-based exascale computers,” said Mark Taylor, chief computational scientist for DOE’s Energy Exascale Earth System Model (E3SM) project and research scientist at Sandia National Laboratories. “Each GPU has the power to run hundreds of micromodels while achieving the throughput of the lower-resolution coarse-grained planetary model.”

Pritchard’s research and new approach are made possible in part by the NSF-funded Frontera supercomputer at the Texas Advanced Computing Center (TACC). On Frontera, the world’s fastest university supercomputer, Pritchard can run his models at time and length scales accessible on only a handful of systems in the US and test their potential for cloud modeling.

“We devised a way for a supercomputer to best split the work of simulating cloud physics among different parts of the world that deserve different resolutions…to make it run much faster,” the team wrote.

Simulating the atmosphere in this way provides Pritchard with the resolution needed to capture the physical processes and turbulent vortices involved in cloud formation. The researchers showed that the multi-model approach produced no undesirable side effects even where patches with different cloud-resolving grid structures met.

“We were glad the differences were small,” he said. “This will bring a new level of flexibility to all climate model users who want to focus high resolution on different locations.”

Untangling and reconnecting the various scales of the CESM model was a challenge that Pritchard’s team mastered. Another involved reprogramming the model to take advantage of the ever-growing number of processors available on modern supercomputing systems.

Pritchard and his team, UCI postdoctoral fellow Liran Peng and University of Washington research scientist Peter Blossey, tackled this problem by breaking the inner domains of CESM’s embedded cloud models into smaller pieces that can run in parallel using MPI, the Message Passing Interface (a way to exchange messages between multiple computers running a parallel program across distributed memory), and orchestrating those computations to use many more processors.
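
The sketch below illustrates the general pattern of such a decomposition using Python and mpi4py (an assumption of this example; the team’s actual code operates on CESM’s models, not this toy). Each MPI rank owns one slice of an embedded cloud model’s domain and trades boundary values with its neighbors each step:

```python
from mpi4py import MPI
import numpy as np

# Minimal domain-decomposition sketch: split one embedded cloud model's
# domain into slices so many more processors can advance it in parallel.

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

GLOBAL_CELLS = 1024                            # cells in one embedded cloud model
assert GLOBAL_CELLS % size == 0
local = np.random.rand(GLOBAL_CELLS // size)   # this rank's slice

left = (rank - 1) % size
right = (rank + 1) % size

for step in range(100):
    # Exchange one halo cell with the neighbors so each slice sees the
    # boundary value it needs (periodic domain).
    halo = np.empty(1)
    comm.Sendrecv(local[-1:], dest=right, recvbuf=halo, source=left)
    # Advance this slice (placeholder upwind-style physics).
    local[1:] += 0.1 * (local[:-1] - local[1:])
    local[0] += 0.1 * (halo[0] - local[0])

# Gather a diagnostic on rank 0.
total = comm.reduce(local.sum(), op=MPI.SUM, root=0)
if rank == 0:
    print(f"domain mean after 100 steps: {total / GLOBAL_CELLS:.4f}")
```

With mpi4py installed, a sketch like this would run as, for example, "mpirun -n 8 python sketch.py".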

“This already seems to offer a fourfold speedup with great efficiency. That means I can be four times as ambitious with my cloud-resolving models,” he said. “I’m really optimistic that this dream of regionalization and MPI decomposition will lead to a completely different landscape of what’s possible.”

CLOUDS FOR MACHINE LEARNING

Pritchard sees another promising approach in machine learning, which his team has been researching since 2017. “I was very provoked by how efficiently a dumb sheet of neurons can reproduce these partial differential equations,” says Pritchard.
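
As a minimal illustration of that idea (the layer sizes, inputs, and outputs below are assumptions for the sketch, not Pritchard’s actual architecture), a plain fully connected network can be trained to map a model column’s state to the heating and moistening tendencies that a cloud-resolving model would otherwise have to compute:

```python
import torch
import torch.nn as nn

# A plain "sheet of neurons": a fully connected network that learns to
# map a coarse-model column state to the tendencies a cloud-resolving
# model would produce. Shapes and names are illustrative assumptions.

N_LEVELS = 30                      # vertical levels in one column
emulator = nn.Sequential(
    nn.Linear(2 * N_LEVELS, 256),  # input: temperature + humidity profiles
    nn.ReLU(),
    nn.Linear(256, 256),
    nn.ReLU(),
    nn.Linear(256, 2 * N_LEVELS),  # output: heating + moistening tendencies
)

# Training pairs would come from embedded high-resolution simulations;
# random tensors stand in for such a training library here.
x = torch.randn(4096, 2 * N_LEVELS)
y = torch.randn(4096, 2 * N_LEVELS)

opt = torch.optim.Adam(emulator.parameters(), lr=1e-3)
for epoch in range(10):
    opt.zero_grad()
    loss = nn.functional.mse_loss(emulator(x), y)
    loss.backward()
    opt.step()
```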

In a paper submitted last fall, Pritchard, lead author Tom Beucler of UCI, and others describe a machine learning approach that successfully predicts atmospheric conditions even in climate regimes on which it was not trained, where other methods have struggled.

This “climate-invariant” model integrates physical knowledge about climate processes into the machine learning algorithms. Their study — which used Stampede2 at TACC, Cheyenne at the National Center for Atmospheric Research, and Expanse at the San Diego Supercomputer Center — showed that the machine learning method can maintain high accuracy across a wide range of climates and regions.
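
One way to picture what “climate-invariant” means: rather than feeding a network raw variables whose values drift outside the training range as the planet warms, the inputs are first rescaled into forms whose distributions stay similar across climates. The Python sketch below shows one such rescaling, converting specific humidity to relative humidity; the constants are approximate and the choice of transformation is only an illustrative example:

```python
import numpy as np

def saturation_vapor_pressure(T):
    """Approximate saturation vapor pressure (Pa) over liquid water,
    via the Magnus formula; T in kelvin."""
    return 610.94 * np.exp(17.625 * (T - 273.15) / (T - 30.11))

def climate_invariant_inputs(T, q, p):
    """Map temperature (K), specific humidity (kg/kg), and pressure (Pa)
    to inputs whose ranges stay comparable across climate regimes."""
    e_sat = saturation_vapor_pressure(T)
    q_sat = 0.622 * e_sat / (p - 0.378 * e_sat)  # saturation specific humidity
    rel_hum = q / q_sat            # stays in roughly [0, 1] in any climate
    return np.stack([rel_hum, T], axis=-1)

# The same relative humidity can correspond to very different raw
# specific humidities in a cold versus a warm climate:
for T, q in [(280.0, 0.004), (305.0, 0.020)]:
    print(climate_invariant_inputs(np.array(T), np.array(q), 101325.0))
```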

“If high-resolution cloud physics through machine learning were ever to succeed, it would change everything about how we do climate simulations,” Pritchard said. “I’m interested in how reproducible and reliable the machine learning approach can be in complex environments.”

Pritchard is well placed to do that. He is a member of the executive committee of the NSF Center for Learning the Earth with Artificial Intelligence and Physics (LEAP), a new science and technology center funded by NSF in 2021 and led by his longtime collaborator on the subject, Professor Pierre Gentine. LEAP brings together climate and data scientists to narrow the range of uncertainty in climate modeling and provide more accurate and actionable climate projections with immediate societal impact.

“All of the research I’ve done so far I would describe as ‘throughput-limited,’” Pritchard said. “My task was to create 10- to 100-year simulations. That limited all my grid options. But if the goal is to create short simulations to train machine learning models, that’s a different landscape.”

Pritchard hopes to soon be able to use the results of his embedded 50-meter models to begin building a large training library. “It’s a really nice data set for machine learning.”

But will AI mature fast enough? Time is of the essence in discovering the fate of clouds.

“If these clouds shrink like the ice sheets, exposing darker surfaces, that will amplify global warming and all the dangers it poses. But if they do the opposite of the ice sheets and thicken, which they could, that would be less dangerous. Some have estimated this as a multi-trillion dollar issue for society. And it has been in question for a long time,” Pritchard said.

Simulation after simulation, government-funded supercomputers are helping Pritchard and others get closer to the answer to this critical question.

“I’m torn between genuine gratitude for the US national computing infrastructure, which is so incredibly helpful in helping us develop and run climate models,” Pritchard said, “and feeling that we need Manhattan Project-level federal funding and cross-agency coordination to actually solve this problem.”

The research was funded by the National Science Foundation (NSF) Climate and Large-Scale Dynamics Program under grants AGS-1912134 and AGS-1912130, and under the auspices of the US Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344. The research used computing resources at the Texas Advanced Computing Center and the Extreme Science and Engineering Discovery Environment (XSEDE), supported by NSF grant ACI-1548562.

JOURNAL: Journal of Advances in Modeling Earth Systems: Load-Balancing Intense Physics Calculations to Embed Regionalized High-Resolution Cloud-Resolving Models in the E3SM and CESM Climate Models

DOI: 10.1029/2021MS002841
