Latent diffusion models for generative precipitation nowcasting with
accurate uncertainty quantification
- URL: http://arxiv.org/abs/2304.12891v1
- Date: Tue, 25 Apr 2023 15:03:15 GMT
- Authors: Jussi Leinonen, Ulrich Hamann, Daniele Nerini, Urs Germann, Gabriele
Franch
- Abstract summary: We introduce a latent diffusion model (LDM) for precipitation nowcasting - short-term forecasting based on the latest observational data.
We benchmark it against the GAN-based Deep Generative Models of Rainfall (DGMR) and a statistical model, PySTEPS.
The clearest advantage of the LDM is that it generates more diverse predictions than DGMR or PySTEPS.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Diffusion models have been widely adopted in image generation, producing
higher-quality and more diverse samples than generative adversarial networks
(GANs). We introduce a latent diffusion model (LDM) for precipitation
nowcasting - short-term forecasting based on the latest observational data. The
LDM is more stable and requires less computation to train than GANs, albeit
with more computationally expensive generation. We benchmark it against the
GAN-based Deep Generative Models of Rainfall (DGMR) and a statistical model,
PySTEPS. The LDM produces more accurate precipitation predictions, while the
comparisons are more mixed when predicting whether the precipitation exceeds
predefined thresholds. The clearest advantage of the LDM is that it generates
more diverse predictions than DGMR or PySTEPS. Rank distribution tests indicate
that the distribution of samples from the LDM accurately reflects the
uncertainty of the predictions. Thus, LDMs are promising for any applications
where uncertainty quantification is important, such as weather and climate.
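The rank distribution test mentioned in the abstract can be illustrated with a minimal sketch: each observation is ranked within its ensemble of forecast members, and a calibrated ensemble yields an approximately flat rank histogram. This is an illustrative implementation, not the authors' code; the function name and synthetic data are assumptions.

```python
import numpy as np

def rank_histogram(observations, ensembles, rng=None):
    """Count verification ranks of observations within ensemble forecasts.

    observations: shape (n,) observed values
    ensembles:    shape (n, m), m ensemble members per observation
    Returns counts of each rank 0..m (length m + 1). For a calibrated
    ensemble the histogram should be approximately flat (uniform).
    """
    rng = np.random.default_rng(rng)
    n, m = ensembles.shape
    # Rank = number of ensemble members below the observation;
    # random tie-breaking avoids bias when values coincide.
    below = (ensembles < observations[:, None]).sum(axis=1)
    ties = (ensembles == observations[:, None]).sum(axis=1)
    ranks = below + rng.integers(0, ties + 1)
    return np.bincount(ranks, minlength=m + 1)

# Synthetic check: observations drawn from the same distribution as the
# ensemble members should produce a roughly uniform rank histogram.
rng = np.random.default_rng(0)
ens = rng.normal(size=(10000, 9))
obs = rng.normal(size=10000)
counts = rank_histogram(obs, ens, rng=1)
print(counts)  # roughly 1000 per bin for a calibrated ensemble
```

An overconfident ensemble would instead pile counts at the extreme ranks (observations often falling outside the ensemble spread), while an underconfident one would peak in the middle.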
Related papers
- Series-to-Series Diffusion Bridge Model [8.590453584544386]
We present a comprehensive framework that encompasses most existing diffusion-based methods.
We propose a novel diffusion-based time series forecasting model, the Series-to-Series Diffusion Bridge Model ($\mathrm{S2DBM}$).
Experimental results demonstrate that $\mathrm{S2DBM}$ delivers superior performance in point-to-point forecasting.
arXiv Detail & Related papers (2024-11-07T07:37:34Z)
- Masked Diffusion Models are Secretly Time-Agnostic Masked Models and Exploit Inaccurate Categorical Sampling [47.82616476928464]
Masked diffusion models (MDMs) have emerged as a popular research topic for generative modeling of discrete data.
We show that both training and sampling of MDMs are theoretically free from the time variable.
We identify, for the first time, an underlying numerical issue, even with the commonly used 32-bit floating-point precision.
arXiv Detail & Related papers (2024-09-04T17:48:19Z)
- SRNDiff: Short-term Rainfall Nowcasting with Condition Diffusion Model [5.21064926344773]
We introduce the diffusion model to the precipitation forecasting task.
We propose a short-term precipitation nowcasting with condition diffusion model based on historical observational data.
By incorporating an additional conditional decoder module in the denoising process, SRNDiff achieves end-to-end conditional rainfall prediction.
arXiv Detail & Related papers (2024-02-21T12:06:06Z)
- Weather Prediction with Diffusion Guided by Realistic Forecast Processes [49.07556359513563]
We introduce a novel method that applies diffusion models (DM) for weather forecasting.
Our method can achieve both direct and iterative forecasting with the same modeling framework.
The flexibility and controllability of our model empowers a more trustworthy DL system for the general weather community.
arXiv Detail & Related papers (2024-02-06T21:28:42Z)
- GDTS: Goal-Guided Diffusion Model with Tree Sampling for Multi-Modal Pedestrian Trajectory Prediction [15.731398013255179]
We propose a novel Goal-Guided Diffusion Model with Tree Sampling for multi-modal trajectory prediction.
A two-stage tree sampling algorithm is presented, which leverages common features to reduce the inference time and improve accuracy for multi-modal prediction.
Experimental results demonstrate that our proposed framework achieves performance comparable to the state of the art with real-time inference speed on public datasets.
arXiv Detail & Related papers (2023-11-25T03:55:06Z)
- Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
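The "inherently high number of iterative steps" in DDPM sampling can be made concrete with a minimal sketch of vanilla ancestral sampling: one network call per denoising step, typically around a thousand steps per sample. The noise-prediction network is stubbed out here; the schedule and step count are standard DDPM assumptions, not details from this paper.

```python
import numpy as np

T = 1000                                      # typical DDPM step count
betas = np.linspace(1e-4, 0.02, T)            # standard linear noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def eps_model(x, t):
    # Placeholder for a trained network that predicts the added noise.
    return np.zeros_like(x)

def ddpm_sample(shape, rng):
    x = rng.normal(size=shape)                # start from pure Gaussian noise
    for t in reversed(range(T)):              # T sequential model evaluations
        eps = eps_model(x, t)
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps) / np.sqrt(alphas[t])
        noise = rng.normal(size=shape) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    return x

sample = ddpm_sample((4, 4), np.random.default_rng(0))
print(sample.shape)  # (4, 4)
```

Because each of the T steps depends on the previous one, the loop cannot be parallelized across steps, which is the cost that few-step methods such as SIDDMs aim to avoid.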
arXiv Detail & Related papers (2023-06-21T18:49:22Z)
- Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis on approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z)
- Preconditioned Score-based Generative Models [49.88840603798831]
An intuitive acceleration method is to reduce the number of sampling iterations, which, however, causes severe performance degradation.
We propose a model-agnostic preconditioned diffusion sampling (PDS) method that leverages matrix preconditioning to alleviate the aforementioned problem.
PDS alters the sampling process of a vanilla SGM at marginal extra computation cost, and without model retraining.
arXiv Detail & Related papers (2023-02-13T16:30:53Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of future values of a series based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Mixture Density Conditional Generative Adversarial Network Models (MD-CGAN) [1.0312968200748118]
We present the Mixture Density Conditional Generative Adversarial Network (MD-CGAN) with a focus on time series forecasting.
By using a Gaussian mixture model as the output distribution, MD-CGAN offers posterior predictions that are non-Gaussian.
arXiv Detail & Related papers (2020-04-08T03:55:30Z)
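Using a Gaussian mixture as the output distribution, as the MD-CGAN entry describes, means forecast samples are drawn in two stages: pick a mixture component, then sample from that component's Gaussian. This is an illustrative sketch of that idea only, not the MD-CGAN implementation; the parameter values are invented for the example.

```python
import numpy as np

def sample_gmm(weights, means, stds, n, rng):
    """Draw n samples from a 1-D Gaussian mixture.

    A mixture output can represent skewed or multi-modal predictive
    distributions that a single Gaussian cannot.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    comp = rng.choice(len(weights), size=n, p=weights)  # pick components
    return rng.normal(means[comp], stds[comp])          # sample within each

rng = np.random.default_rng(0)
means = np.array([-2.0, 3.0])
stds = np.array([0.5, 1.0])
samples = sample_gmm([0.3, 0.7], means, stds, 100_000, rng)
# Mixture mean = 0.3 * (-2.0) + 0.7 * 3.0 = 1.5
print(samples.mean())
```

In an MD-CGAN-style model the weights, means, and standard deviations would be produced by the network conditioned on the series history, rather than fixed as here.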
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.