Data-Efficient Ensemble Weather Forecasting with Diffusion Models
- URL: http://arxiv.org/abs/2509.11047v1
- Date: Sun, 14 Sep 2025 02:22:16 GMT
- Title: Data-Efficient Ensemble Weather Forecasting with Diffusion Models
- Authors: Kevin Valencia, Ziyang Liu, Justin Cui
- Abstract summary: Diffusion models are typically autoregressive and are thus computationally expensive. This is a challenge in climate science, where data can be limited, costly, or difficult to work with. We evaluate several data sampling strategies and show that a simple time-stratified sampling approach achieves performance similar to or better than full-data training.
- Score: 5.03317364227682
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Although numerical weather forecasting methods have dominated the field, recent advances in deep learning methods, such as diffusion models, have shown promise in ensemble weather forecasting. However, such models are typically autoregressive and are thus computationally expensive. This is a challenge in climate science, where data can be limited, costly, or difficult to work with. In this work, we explore the impact of curated data selection on these autoregressive diffusion models. We evaluate several data sampling strategies and show that a simple time-stratified sampling approach achieves performance similar to or better than full-data training. Notably, it outperforms the full-data model on certain metrics and performs only slightly worse on others while using only 20% of the training data. Our results demonstrate the feasibility of data-efficient diffusion training, especially for weather forecasting, and motivate future work on adaptive or model-aware sampling methods that go beyond random or purely temporal sampling.
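The abstract reports the key result (a time-stratified 20% subset matching full-data training) but not the sampling procedure itself. Below is a minimal sketch of one way such a time-stratified subset could be drawn from a reanalysis-style training archive; the function name, the choice of calendar month as the stratum, and the six-hourly snapshot spacing are illustrative assumptions, not details from the paper.

```python
import numpy as np

def time_stratified_sample(timestamps, fraction=0.2, seed=0):
    """Select a fraction of training snapshots while preserving temporal coverage.

    timestamps : 1-D array of numpy datetime64 values, one per training snapshot
    fraction   : fraction of the full dataset to keep (the paper reports ~20%)

    Strata are calendar months here purely for illustration; the paper does not
    specify its stratification variable.
    """
    rng = np.random.default_rng(seed)
    timestamps = np.asarray(timestamps, dtype="datetime64[s]")
    # Group snapshot indices by calendar month so every part of the
    # seasonal cycle is represented in the subset.
    months = timestamps.astype("datetime64[M]").astype(int) % 12
    selected = []
    for m in range(12):
        idx = np.where(months == m)[0]
        if idx.size == 0:
            continue
        k = max(1, int(round(fraction * idx.size)))
        selected.append(rng.choice(idx, size=k, replace=False))
    return np.sort(np.concatenate(selected))

# Example: six-hourly snapshots over ten years, keep ~20% with seasonal balance.
times = np.arange(np.datetime64("2010-01-01"), np.datetime64("2020-01-01"),
                  np.timedelta64(6, "h"))
subset = time_stratified_sample(times, fraction=0.2)
print(len(subset) / len(times))  # ~0.2
```

The indices returned by this sketch would then be used to subset whatever training dataset object feeds the diffusion model's data loader.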
Related papers
- Nonparametric Data Attribution for Diffusion Models [57.820618036556084]
Data attribution for generative models seeks to quantify the influence of individual training examples on model outputs. We propose a nonparametric attribution method that operates entirely on data, measuring influence via patch-level similarity between generated and training images.
arXiv Detail & Related papers (2025-10-16T03:37:16Z) - Diffusion-Based Generation and Imputation of Driving Scenarios from Limited Vehicle CAN Data [13.575299934411978]
Diffusion models have been shown to be effective at generating realistic synthetic data. We propose a hybrid generative approach that combines autoregressive and non-autoregressive techniques. Our best model outperforms even the training data in terms of physical correctness, while showing plausible driving behavior.
arXiv Detail & Related papers (2025-09-15T19:07:28Z) - Breaking Silos: Adaptive Model Fusion Unlocks Better Time Series Forecasting [64.45587649141842]
Time-series forecasting plays a critical role in many real-world applications. No single model consistently outperforms others across different test samples; instead, each model excels in specific cases. We introduce TimeFuse, a framework for collective time-series forecasting with sample-level adaptive fusion of heterogeneous models.
arXiv Detail & Related papers (2025-05-24T00:45:07Z) - Adaptive Non-Uniform Timestep Sampling for Diffusion Model Training [4.760537994346813]
As data distributions grow more complex, training diffusion models to convergence becomes increasingly compute-intensive.
We introduce a non-uniform timestep sampling method that prioritizes the timesteps most critical to training.
Our method shows robust performance across various datasets, scheduling strategies, and diffusion architectures (a minimal sketch of this idea appears after this list).
arXiv Detail & Related papers (2024-11-15T07:12:18Z) - On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z) - ReAugment: Model Zoo-Guided RL for Few-Shot Time Series Augmentation and Forecasting [74.00765474305288]
We present a pilot study on using reinforcement learning (RL) for time series data augmentation. Our method, ReAugment, tackles three critical questions: which parts of the training set should be augmented, how the augmentation should be performed, and what advantages RL brings to the process.
arXiv Detail & Related papers (2024-09-10T07:34:19Z) - Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z) - Learning Defect Prediction from Unrealistic Data [57.53586547895278]
Pretrained models of code have become popular choices for code understanding and generation tasks.
Such models tend to be large and require commensurate volumes of training data.
It has become popular to train models with far larger but less realistic datasets, such as functions with artificially injected bugs.
Models trained on such data tend to perform well only on similar data, while underperforming on real-world programs.
arXiv Detail & Related papers (2023-11-02T01:51:43Z) - Precipitation nowcasting with generative diffusion models [0.0]
We study the efficacy of diffusion models in handling the task of precipitation nowcasting.
Our work is conducted in comparison to the performance of well-established U-Net models.
arXiv Detail & Related papers (2023-08-13T09:51:16Z) - BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping [64.54271680071373]
Diffusion models have demonstrated excellent potential for generating diverse images.
Knowledge distillation has been recently proposed as a remedy that can reduce the number of inference steps to one or a few.
We present a novel technique called BOOT that overcomes the data requirements of existing distillation methods with an efficient data-free algorithm.
arXiv Detail & Related papers (2023-06-08T20:30:55Z) - A case study of spatiotemporal forecasting techniques for weather forecasting [4.347494885647007]
The correlations of real-world processes are spatiotemporal, and the data generated by them exhibits both spatial and temporal evolution.
Time series-based models are a viable alternative to numerical forecasts.
We show that decomposition-based spatiotemporal prediction models reduce computational costs while improving accuracy.
arXiv Detail & Related papers (2022-09-29T13:47:02Z) - Long-term stability and generalization of observationally-constrained stochastic data-driven models for geophysical turbulence [0.19686770963118383]
Deep learning models can mitigate certain biases in current state-of-the-art weather models.
Data-driven models require large amounts of training data, which may not be available from reanalysis (observational) products.
Deterministic data-driven forecasting models suffer from issues with long-term stability and unphysical climate drift.
We propose a convolutional variational autoencoder-based data-driven model that is pre-trained on an imperfect climate model simulation.
arXiv Detail & Related papers (2022-05-09T23:52:37Z)
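The "Adaptive Non-Uniform Timestep Sampling" entry above names the idea but not its mechanics. As a rough illustration only, here is a minimal PyTorch sketch of one loss-proportional variant: the class name, the exponential-moving-average bookkeeping, and the specific weighting rule are assumptions made for this sketch, not the cited paper's method.

```python
import torch

class LossProportionalTimestepSampler:
    """Draw diffusion timesteps in proportion to a running estimate of their loss.

    Illustrative loss-proportional scheme (an assumption of this sketch),
    not the adaptive criterion used in the cited paper.
    """
    def __init__(self, num_timesteps=1000, smoothing=0.99):
        self.num_timesteps = num_timesteps
        self.smoothing = smoothing
        self.loss_ema = torch.ones(num_timesteps)  # start from a uniform estimate

    def sample(self, batch_size):
        # Timesteps with larger recent loss are drawn more often.
        probs = self.loss_ema / self.loss_ema.sum()
        return torch.multinomial(probs, batch_size, replacement=True)

    def update(self, timesteps, losses):
        # Exponential moving average of the per-timestep loss keeps the estimate stable.
        for t, l in zip(timesteps.tolist(), losses.detach().tolist()):
            self.loss_ema[t] = self.smoothing * self.loss_ema[t] + (1 - self.smoothing) * l
```

In a training loop, one would call `sampler.sample(batch_size)` to pick the timesteps for each batch, compute the per-sample diffusion loss at those timesteps, and feed it back with `sampler.update(timesteps, losses)` so that harder timesteps are visited more frequently.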
This list is automatically generated from the titles and abstracts of the papers on this site.