Generative AI Tracks Typhoons, Tames Energy Use

Weather forecasters in Taiwan had their hair blown back when they saw a typhoon up close, simulated on a computer in a fraction of the time and energy the job traditionally requires.

It’s a reaction that users in many fields are feeling as generative AI shows them how new levels of performance contribute to reductions in total cost of ownership.

Inside the AI of the Storm

Tracking a typhoon provided a great test case of generative AI’s prowess. The work traditionally begins with clusters of CPUs cranking on complex algorithms to create atmospheric models with a 25-kilometer resolution.

Enter CorrDiff, a generative AI model that’s part of NVIDIA Earth-2, a set of services and software for weather and climate research.

Using the class of diffusion models that powers today’s text-to-image services, CorrDiff refines the 25-km models to a two-kilometer resolution 1,000x faster, using 3,000x less energy for a single inference than traditional methods.
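The task CorrDiff tackles can be pictured as super-resolution on a weather grid: take a coarse 25-km field and produce a much denser 2-km one. The sketch below only illustrates the grids involved, using nearest-neighbor upsampling as a crude stand-in for the learned diffusion model; the array sizes and upsampling factor are illustrative assumptions, not CorrDiff's actual configuration.

```python
import numpy as np

# Toy coarse-resolution field standing in for a 25-km atmospheric model output.
coarse = np.random.default_rng(1).random((8, 8))

# ~25 km / 2 km, rounded to an integer for this sketch. CorrDiff would
# generate the fine-scale detail; np.repeat just copies coarse values.
factor = 12
fine = np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)

print(coarse.shape, "->", fine.shape)  # (8, 8) -> (96, 96)
```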

CorrDiff Cuts Costs 50x, Energy Use 25x

CorrDiff shines on the NVIDIA AI platform, even when retraining the model once a year and using ensembles of a thousand forecasts to boost the accuracy of predictions. Compared to traditional methods under these conditions, it slashes annual cost by 50x and energy use by 25x.
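Why run a thousand forecasts instead of one? Averaging an ensemble washes out the random error of any single member. The sketch below shows the idea with synthetic data; the "true" wind speed, the noise level and the error model are all illustrative assumptions, not properties of CorrDiff.

```python
import numpy as np

rng = np.random.default_rng(0)

truth = 30.0       # hypothetical true wind speed, in m/s
n_members = 1000   # ensemble size mentioned above

# Each member forecast is the truth plus independent random error.
members = truth + rng.normal(0.0, 3.0, size=n_members)

single_error = abs(members[0] - truth)
ensemble_error = abs(members.mean() - truth)

print(f"single-member error: {single_error:.2f} m/s")
print(f"ensemble-mean error: {ensemble_error:.2f} m/s")
```

With independent errors, the ensemble mean's error shrinks roughly with the square root of the ensemble size, which is why larger ensembles sharpen the forecast.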

That means work that used to require nearly $3 million for a cluster of CPUs and the energy to run them can be done for about $60,000 on a single system with an NVIDIA H100 Tensor Core GPU. It’s a massive reduction that shows how generative AI and accelerated computing increase energy efficiency and lower total cost of ownership.
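The headline savings factor follows directly from those two dollar figures, as a quick back-of-the-envelope check confirms:

```python
# Back-of-the-envelope check of the article's cost figures (illustrative).
cpu_cluster_cost = 3_000_000  # ~$3 million for the CPU cluster
gpu_system_cost = 60_000      # ~$60,000 for one H100-based system

savings_factor = cpu_cluster_cost / gpu_system_cost
print(f"cost reduction: {savings_factor:.0f}x")  # prints "cost reduction: 50x"
```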

The technology also helps forecasters see more precisely where a typhoon will land, potentially saving lives.

“NVIDIA’s CorrDiff generative AI model opens the door to the use of AI-generated kilometer-scale weather forecasts, enabling Taiwan to prepare better for typhoons,” said Hongey Chen, a director of Taiwan’s National Science and Technology Center for Disaster Reduction.

Using CorrDiff, Taiwan’s forecasters could save nearly a gigawatt-hour a year. Energy savings could balloon if the nearly 200 regional weather data centers around the world adopt the technology for more sustainable computing.

Companies that sell commercial forecasts are also adopting CorrDiff, attracted by its speed and savings.

Broad Horizons for Energy Efficiency

NVIDIA Earth-2 takes these capabilities to a planetary scale. It fuses AI, physics simulations and observed data to help countries and companies address the impacts of climate change, which is expected to cost a million lives and $1.7 trillion a year by 2050.

Accelerated computing and generative AI are bringing new levels of performance and energy efficiency to many applications. Explainers on green computing and why GPUs are great for AI provide more context and some examples.

Compare the costs and energy consumption of popular workloads running on an x86 CPU-based server versus an NVIDIA GPU server with this simple calculator. And watch NVIDIA CEO Jensen Huang’s keynote address at COMPUTEX to get the big picture.
