Introduction
Accurate weather forecasting is crucial for many sectors of society. Numerical weather prediction (NWP) models, such as the ECMWF's Integrated Forecast System (IFS), are widely used, but their accuracy degrades with increasing lead time due to factors such as limited resolution, parameterization approximations, and the inherent chaos of the atmosphere. Ensemble prediction systems, such as the ECMWF EPS, quantify this uncertainty by running multiple forecasts with perturbed initial conditions and parameterizations. Recently, machine learning (ML) models have shown promise, offering faster inference and potentially higher accuracy than uncalibrated NWP models. However, ML models have struggled with long-term forecasts because errors accumulate as predictions are rolled out autoregressively. While several ML models have outperformed ECMWF HRES in 10-day forecasts, matching the performance of the ECMWF ensemble in 15-day forecasts remains a challenge. This study addresses that challenge by proposing FuXi, a cascade ML model designed to reduce error accumulation and generate accurate 15-day global weather forecasts. FuXi uses a novel cascaded architecture built from pre-trained models, each optimized for a specific forecast time window (0-5 days, 5-10 days, and 10-15 days). The base model uses the U-Transformer architecture to effectively learn complex relationships from high-dimensional weather data. The study aims to demonstrate that FuXi achieves forecast skill comparable to the ECMWF EM for 15-day forecasts and extends the skillful forecast lead time beyond that of ECMWF HRES.
Literature Review
The paper reviews existing ML-based weather forecasting models, highlighting their strengths and limitations. It mentions WeatherBench as a benchmark dataset, and discusses previous successful models like ResNet, SwinVRNN, FourCastNet, SwinRDM, Pangu-Weather, and GraphCast. The review emphasizes the challenge of mitigating error accumulation in long-term forecasting, discussing techniques like autoregressive multi-time step loss and hierarchical temporal aggregation. The authors note the limitations of using a single model for various lead times, as performance varies across different forecast horizons. This literature review sets the stage for the introduction of FuXi as a solution that addresses the existing shortcomings.
Methodology
FuXi uses a cascade architecture composed of three pre-trained models: FuXi-Short (0-5 days), FuXi-Medium (5-10 days), and FuXi-Long (10-15 days). Each model is a variant of the U-Transformer architecture, employing Swin Transformer V2 blocks and a space-time cube embedding for efficient processing of high-dimensional weather data. The input data consists of 39 years of 6-hourly ECMWF ERA5 reanalysis data at 0.25° resolution, encompassing both surface and upper-air variables. The training process involves two steps: pre-training, where the model learns to predict a single time step using a latitude-weighted L1 loss, and fine-tuning, where each model is optimized for its specific time window using an autoregressive training regime with a curriculum learning schedule. To generate ensemble forecasts, Perlin noise is added to the initial conditions, and Monte Carlo dropout is applied to perturb model parameters. The evaluation metrics used are latitude-weighted RMSE and ACC for deterministic forecasts, and CRPS and SSR for ensemble forecasts. ECMWF HRES and EM forecasts serve as baselines for comparison, using ECMWF's internal verification method.
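The latitude-weighted L1 loss used in pre-training can be sketched in a few lines: grid cells are weighted by the cosine of their latitude so that the dense polar rows of an equiangular grid do not dominate the loss. This is a minimal NumPy illustration of the idea; the function names, the grid size, and the normalization of weights to mean one are illustrative choices, not the paper's exact implementation.

```python
import numpy as np

def latitude_weights(lats_deg: np.ndarray) -> np.ndarray:
    """Per-latitude weights proportional to cos(latitude), normalized to mean 1."""
    w = np.cos(np.deg2rad(lats_deg))
    return w / w.mean()

def lat_weighted_l1(pred: np.ndarray, target: np.ndarray, lats_deg: np.ndarray) -> float:
    """Latitude-weighted L1 loss over a single (lat, lon) field."""
    w = latitude_weights(lats_deg)[:, None]   # broadcast weights across longitude
    return float(np.mean(w * np.abs(pred - target)))

# Toy example on a shrunken equiangular grid (the real model uses 0.25 degrees).
lats = np.linspace(-89.875, 89.875, 72)
pred = np.zeros((72, 144))
target = np.ones((72, 144))
# Every absolute error is 1 and the weights average to 1, so the loss is 1.
loss = lat_weighted_l1(pred, target, lats)
```

The same cosine weighting underlies the latitude-weighted RMSE and ACC used for evaluation, so a single weight helper serves both training and verification code.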
Key Findings
FuXi demonstrates performance comparable to the ECMWF EM in 15-day forecasts, significantly outperforming ECMWF HRES. The skillful forecast lead time (ACC > 0.6) is extended to 10.5 days for Z500 and 14.5 days for T2M. The FuXi ensemble shows CRPS comparable to the ECMWF ensemble within 9 days for Z500, T850, MSL, and T2M. While the FuXi ensemble shows some overdispersion at early lead times and underdispersion later on, its overall performance is comparable to the ECMWF ensemble. The spatial distribution of RMSE shows that FuXi outperforms HRES at most grid points and performs comparably to the ECMWF EM.
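The "skillful lead time" criterion above is simply the longest lead time at which the anomaly correlation coefficient (ACC) stays above 0.6. A minimal latitude-weighted ACC sketch, assuming anomalies are taken with respect to a climatology field (the function and variable names here are illustrative, not the paper's code):

```python
import numpy as np

def lat_weighted_acc(fc: np.ndarray, obs: np.ndarray,
                     clim: np.ndarray, lats_deg: np.ndarray) -> float:
    """Latitude-weighted anomaly correlation coefficient for one (lat, lon) field."""
    w = np.cos(np.deg2rad(lats_deg))[:, None]   # cosine-of-latitude weights
    fa, oa = fc - clim, obs - clim              # forecast and observed anomalies
    num = np.sum(w * fa * oa)
    den = np.sqrt(np.sum(w * fa**2) * np.sum(w * oa**2))
    return float(num / den)

# Sanity check: a forecast whose anomaly matches the observed anomaly has ACC = 1.
lats = np.linspace(-90, 90, 37)
rng = np.random.default_rng(0)
clim = rng.normal(size=(37, 72))
obs = clim + rng.normal(size=(37, 72))
acc_perfect = lat_weighted_acc(obs, obs, clim, lats)
skillful = acc_perfect > 0.6   # the ACC > 0.6 "skillful" threshold
```

In practice the ACC is computed per lead time and averaged over many initialization dates; the skillful lead time is where that averaged curve crosses 0.6.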
Discussion
The results show that the cascade ML architecture effectively reduces error accumulation in long-term weather forecasting. FuXi's superior performance compared to ECMWF HRES, and its comparable performance to the ECMWF EM for 15-day forecasts, highlight the potential of ML models for extended-range weather prediction. The extended skillful forecast lead time indicates improved accuracy and reliability. While the FuXi ensemble shows some inconsistencies in spread skill, its overall CRPS values are comparable to ECMWF, demonstrating the value of an ensemble approach even in ML-based forecasting. The success of FuXi underscores the advantages of employing a cascaded architecture and a robust base model like the U-Transformer for handling high-dimensional weather data.
Conclusion
FuXi represents a significant advancement in ML-based global weather forecasting, achieving 15-day forecasts with skill comparable to the ECMWF ensemble mean. Its cascaded architecture effectively reduces error accumulation, allowing for accurate predictions at longer lead times. Future work will explore flow-dependent perturbation methods for the ensemble, extend the model to sub-seasonal forecasting, and develop data-driven data assimilation techniques for improved initial conditions.
Limitations
The study's primary limitation lies in the reliance on ERA5 reanalysis data for training and evaluation. While ERA5 is considered high-quality, it still contains errors, and the model's performance might differ when tested against real-world observations. The use of Perlin noise for ensemble generation might not be optimal for capturing realistic forecast uncertainty, particularly for long lead times, and more sophisticated perturbation techniques are needed. The model is not yet fully end-to-end, still requiring analysis data from NWP models for initial conditions.
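The ensemble-generation step discussed above (smooth noise added to the initial conditions) can be illustrated with a simplified stand-in: the paper uses true Perlin noise plus Monte Carlo dropout, whereas this sketch only reproduces the "spatially correlated perturbation" idea by bilinearly upsampling coarse Gaussian noise. All names, shapes, and amplitudes here are illustrative assumptions.

```python
import numpy as np

def smooth_noise(shape, coarse=(6, 12), amplitude=0.1, rng=None):
    """Spatially correlated noise field: coarse Gaussian noise bilinearly
    upsampled to the target grid (a simplified stand-in for Perlin noise)."""
    if rng is None:
        rng = np.random.default_rng()
    coarse_field = rng.normal(size=coarse)
    y = np.linspace(0, coarse[0] - 1, shape[0])
    x = np.linspace(0, coarse[1] - 1, shape[1])
    # Interpolate along latitude, then along longitude.
    rows = np.array([np.interp(y, np.arange(coarse[0]), coarse_field[:, j])
                     for j in range(coarse[1])]).T          # (shape[0], coarse[1])
    field = np.array([np.interp(x, np.arange(coarse[1]), rows[i])
                      for i in range(shape[0])])            # (shape[0], shape[1])
    return amplitude * field

def perturbed_ensemble(initial, n_members=5, seed=0):
    """Ensemble initial conditions: the same analysis plus independent smooth noise."""
    rng = np.random.default_rng(seed)
    return [initial + smooth_noise(initial.shape, rng=rng) for _ in range(n_members)]

members = perturbed_ensemble(np.zeros((72, 144)), n_members=5)
```

Because such perturbations are flow-independent, they may under- or over-disperse the ensemble at different lead times, which is exactly the limitation the authors flag and why they propose flow-dependent perturbations as future work.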