PostCast: Generalizable Postprocessing for Precipitation Nowcasting via Unsupervised Blurriness Modeling
- URL: http://arxiv.org/abs/2410.05805v1
- Date: Tue, 8 Oct 2024 08:38:23 GMT
- Title: PostCast: Generalizable Postprocessing for Precipitation Nowcasting via Unsupervised Blurriness Modeling
- Authors: Junchao Gong, Siwei Tu, Weidong Yang, Ben Fei, Kun Chen, Wenlong Zhang, Xiaokang Yang, Wanli Ouyang, Lei Bai
- Abstract summary: We propose an unsupervised postprocessing method that eliminates blurriness without requiring training on pairs of blurry predictions and the corresponding ground truth.
A zero-shot blur kernel estimation mechanism and an auto-scale denoise guidance strategy are introduced to adapt the unconditional DDPM to any blurriness mode.
- Score: 85.56969895866243
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Precipitation nowcasting plays a pivotal role in socioeconomic sectors, especially in severe convective weather warnings. Although notable progress has been achieved by approaches that mine spatiotemporal correlations with deep learning, these methods still suffer from severe blurriness as the lead time increases, which hampers accurate prediction of extreme precipitation. To alleviate blurriness, researchers have explored generative methods conditioned on blurry predictions. However, the pairs of blurry predictions and corresponding ground truth must be generated in advance, making the training pipeline cumbersome and limiting the generality of generative models to the blur modes that appear in the training data. By rethinking the blurriness in precipitation nowcasting as a blur kernel acting on predictions, we propose an unsupervised postprocessing method that eliminates blurriness without requiring training on pairs of blurry predictions and corresponding ground truth. Specifically, we use blurry predictions to guide the generation process of a pre-trained unconditional denoising diffusion probabilistic model (DDPM) to obtain high-fidelity predictions with the blurriness removed. A zero-shot blur kernel estimation mechanism and an auto-scale denoise guidance strategy are introduced to adapt the unconditional DDPM to any blurriness mode, varying across datasets and lead times in precipitation nowcasting. Extensive experiments on 7 precipitation radar datasets demonstrate the generality and superiority of our method.
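As a rough illustration of the guided sampling described in the abstract (not the authors' released code), the sketch below assumes a pretrained noise-prediction network `eps_model(x, t)` and precomputed DDPM cumulative alphas; a fixed isotropic Gaussian kernel and a constant guidance weight `scale` stand in for the paper's zero-shot blur kernel estimation and auto-scale denoise guidance.
```python
import torch
import torch.nn.functional as F

def gaussian_kernel(size=9, sigma=2.0):
    """Isotropic Gaussian blur kernel (an assumed parameterization)."""
    ax = torch.arange(size, dtype=torch.float32) - size // 2
    xx, yy = torch.meshgrid(ax, ax, indexing="ij")
    k = torch.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return (k / k.sum()).view(1, 1, size, size)

def blur_guided_sample(eps_model, blurry_pred, alphas_cumprod, steps, scale=1.0):
    """Guide unconditional DDPM sampling so that blurring the clean estimate
    reproduces the blurry nowcast (single-channel radar fields assumed)."""
    kernel = gaussian_kernel().to(blurry_pred.device)
    x = torch.randn_like(blurry_pred)
    for t in reversed(range(steps)):
        a_t = alphas_cumprod[t]
        a_prev = alphas_cumprod[t - 1] if t > 0 else torch.ones_like(a_t)
        x_in = x.detach().requires_grad_(True)
        eps = eps_model(x_in, torch.full((x.shape[0],), t, device=x.device))
        x0_hat = (x_in - (1 - a_t).sqrt() * eps) / a_t.sqrt()
        # Data-consistency loss: the blurred clean estimate should match the nowcast.
        loss = F.mse_loss(F.conv2d(x0_hat, kernel, padding=kernel.shape[-1] // 2), blurry_pred)
        grad = torch.autograd.grad(loss, x_in)[0]
        # DDIM-style deterministic update plus the guidance step.
        x = (a_prev.sqrt() * x0_hat + (1 - a_prev).sqrt() * eps).detach() - scale * grad
    return x.detach()
```
In the paper, the kernel and the guidance scale are estimated per sample rather than fixed as in this sketch.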
Related papers
- Fast constrained sampling in pre-trained diffusion models [77.21486516041391]
Diffusion models have dominated the field of large, generative image models.
We propose an algorithm for fast constrained sampling in large pre-trained diffusion models.
arXiv Detail & Related papers (2024-10-24T14:52:38Z)
- Continuous Ensemble Weather Forecasting with Diffusion models [10.730406954385927]
Continuous Ensemble Forecasting is a novel and flexible method for sampling ensemble forecasts in diffusion models.
It can generate temporally consistent ensemble trajectories completely in parallel, with no autoregressive steps.
We demonstrate that the method achieves competitive results for global weather forecasting with good probabilistic properties.
arXiv Detail & Related papers (2024-10-07T18:51:23Z)
- GPTCast: a weather language model for precipitation nowcasting [0.0]
GPTCast is a generative deep-learning method for ensemble nowcasting of radar-based precipitation.
We employ a GPT model as a forecaster to learn precipitation dynamics using tokenized radar images.
arXiv Detail & Related papers (2024-07-02T09:25:58Z)
- ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecasts.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining overall forecast accuracy comparable to the top medium-range forecast models.
arXiv Detail & Related papers (2024-02-02T10:34:13Z)
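As a toy illustration of the asymmetric-optimization idea behind Exloss (a hypothetical loss, not the one defined in the paper), the snippet below penalizes under-prediction of heavy precipitation more strongly than over-prediction.
```python
import torch

def asymmetric_extreme_loss(pred, target, under_weight=4.0, over_weight=1.0):
    """Toy asymmetric MSE: errors where the model under-predicts (target > pred)
    are weighted more heavily, emphasizing extreme values."""
    err = target - pred
    weight = torch.where(err > 0, torch.full_like(err, under_weight),
                         torch.full_like(err, over_weight))
    return (weight * err ** 2).mean()
```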
- Fine-grained Forecasting Models Via Gaussian Process Blurring Effect [6.472434306724611]
Time series forecasting is a challenging task due to the existence of complex and dynamic temporal dependencies.
Using more training data is one way to improve accuracy, but such data is often limited.
We build on successful denoising approaches for image generation by advocating an end-to-end forecasting and denoising paradigm.
arXiv Detail & Related papers (2023-12-21T20:25:16Z)
- DiffCast: A Unified Framework via Residual Diffusion for Precipitation Nowcasting [20.657502066923023]
Precipitation nowcasting is an important task that predicts radar echo sequences from current observations, serving both meteorological science and smart city applications.
Previous studies address the problem either from the perspectives of deterministic modeling or probabilistic modeling.
We propose to decompose and model chaotic, evolving precipitation systems from the perspective of global deterministic motion and local variations with a residual mechanism.
arXiv Detail & Related papers (2023-12-11T11:26:32Z)
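The decomposition described in the DiffCast summary can be read roughly as follows (a conceptual sketch with assumed interfaces, not the paper's code): a deterministic backbone supplies the global motion, and a diffusion-based sampler contributes a stochastic residual.
```python
def residual_nowcast(backbone, sample_residual, past_frames):
    """Compose a forecast from a deterministic component and a stochastic residual."""
    mean_forecast = backbone(past_frames)                    # global deterministic motion
    residual = sample_residual(past_frames, mean_forecast)   # local variations via diffusion
    return mean_forecast + residual
```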
- Learning Robust Precipitation Forecaster by Temporal Frame Interpolation [65.5045412005064]
We develop a robust precipitation forecasting model that demonstrates resilience against spatial-temporal discrepancies.
Our approach has led to significant improvements in forecasting precision, culminating in our model securing 1st place in the transfer learning leaderboard of the Weather4cast'23 competition.
arXiv Detail & Related papers (2023-11-30T08:22:08Z)
- Gradpaint: Gradient-Guided Inpainting with Diffusion Models [71.47496445507862]
Denoising Diffusion Probabilistic Models (DDPMs) have recently achieved remarkable results in conditional and unconditional image generation.
We present GradPaint, which steers the generation towards a globally coherent image.
GradPaint generalizes well to diffusion models trained on various datasets, improving upon current state-of-the-art supervised and unsupervised methods.
arXiv Detail & Related papers (2023-09-18T09:36:24Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
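A rough sketch of the "bring predictions back to the training data manifold" idea from the Bi-Noising summary (assumed interfaces, not the paper's actual method): partially noise a degraded prediction, then denoise it with a pretrained unconditional DDPM.
```python
import torch

@torch.no_grad()
def manifold_refine(eps_model, prediction, alphas_cumprod, start_t):
    """Noise the prediction to timestep `start_t`, then denoise it back with
    DDIM-style steps so the result lies closer to the model's data manifold."""
    a = alphas_cumprod[start_t]
    x = a.sqrt() * prediction + (1 - a).sqrt() * torch.randn_like(prediction)
    for t in reversed(range(start_t + 1)):
        a_t = alphas_cumprod[t]
        a_prev = alphas_cumprod[t - 1] if t > 0 else torch.ones_like(a_t)
        eps = eps_model(x, torch.full((x.shape[0],), t, device=x.device))
        x0_hat = (x - (1 - a_t).sqrt() * eps) / a_t.sqrt()
        x = a_prev.sqrt() * x0_hat + (1 - a_prev).sqrt() * eps
    return x
```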