SynCast: Synergizing Contradictions in Precipitation Nowcasting via Diffusion Sequential Preference Optimization
- URL: http://arxiv.org/abs/2510.21847v1
- Date: Wed, 22 Oct 2025 16:11:22 GMT
- Title: SynCast: Synergizing Contradictions in Precipitation Nowcasting via Diffusion Sequential Preference Optimization
- Authors: Kaiyi Xu, Junchao Gong, Wenlong Zhang, Ben Fei, Lei Bai, Wanli Ouyang,
- Abstract summary: We introduce preference optimization into precipitation nowcasting for the first time, motivated by the success of reinforcement learning from human feedback in large language models. In the first stage, the framework focuses on reducing FAR, training the model to effectively suppress false alarms.
- Score: 62.958457694151384
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Precipitation nowcasting based on radar echoes plays a crucial role in monitoring extreme weather and supporting disaster prevention. Although deep learning approaches have achieved significant progress, they still face notable limitations. For example, deterministic models tend to produce over-smoothed predictions, which struggle to capture extreme events and fine-scale precipitation patterns. Probabilistic generative models, due to their inherent randomness, often show fluctuating performance across different metrics and rarely achieve consistently optimal results. Furthermore, precipitation nowcasting is typically evaluated using multiple metrics, some of which are inherently conflicting. For instance, there is often a trade-off between the Critical Success Index (CSI) and the False Alarm Ratio (FAR), making it challenging for existing models to deliver forecasts that perform well on both metrics simultaneously. To address these challenges, we introduce preference optimization into precipitation nowcasting for the first time, motivated by the success of reinforcement learning from human feedback in large language models. Specifically, we propose SynCast, a method that employs the two-stage post-training framework of Diffusion Sequential Preference Optimization (Diffusion-SPO), to progressively align conflicting metrics and consistently achieve superior performance. In the first stage, the framework focuses on reducing FAR, training the model to effectively suppress false alarms. Building on this foundation, the second stage further optimizes CSI with constraints that preserve FAR alignment, thereby achieving synergistic improvements across these conflicting metrics.
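The CSI–FAR tension the abstract describes follows directly from the standard contingency-table definitions of the two scores. A minimal sketch of those definitions (the threshold and the example arrays are illustrative, not taken from the paper):

```python
import numpy as np

def csi_far(pred, obs, threshold=1.0):
    """Standard contingency-table scores for binarized precipitation fields.

    hits: predicted and observed; misses: observed only;
    false alarms: predicted only.
    CSI = hits / (hits + misses + false_alarms)
    FAR = false_alarms / (hits + false_alarms)
    """
    p = pred >= threshold
    o = obs >= threshold
    hits = np.sum(p & o)
    misses = np.sum(~p & o)
    false_alarms = np.sum(p & ~o)
    csi = hits / max(hits + misses + false_alarms, 1)
    far = false_alarms / max(hits + false_alarms, 1)
    return csi, far

# Predicting rain more aggressively adds hits (raising CSI) but also adds
# false alarms (raising FAR) -- the conflicting-metric pair SynCast targets.
pred = np.array([[2.0, 0.5], [3.0, 1.5]])
obs = np.array([[1.5, 0.0], [0.2, 2.0]])
print(csi_far(pred, obs))  # 2 hits, 1 false alarm, 0 misses
```

This is why a single loss rarely optimizes both: any change that trades false alarms for hits moves the two scores in opposite directions, motivating the staged alignment described above.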
Related papers
- STLDM: Spatio-Temporal Latent Diffusion Model for Precipitation Nowcasting [22.29494904927957]
Precipitation nowcasting is a critical prediction task for society to prevent severe damage owing to extreme weather events. We present a simple yet effective model architecture termed STLDM, a diffusion-based model that learns the latent representation end to end alongside both the Variational Autoencoder and the conditioning network.
arXiv Detail & Related papers (2025-12-24T11:34:44Z) - Forecasting Fails: Unveiling Evasion Attacks in Weather Prediction Models [60.728124907335]
This work introduces Weather Adaptive Adversarial Perturbation Optimization (WAAPO), a novel framework for generating targeted adversarial perturbations. WAAPO achieves this by incorporating constraints for channel sparsity, spatial localization, and smoothness, ensuring that perturbations remain physically realistic and imperceptible. Our experiments highlight critical vulnerabilities in AI-driven forecasting models, where small perturbations to initial conditions can result in significant deviations.
arXiv Detail & Related papers (2025-12-09T17:20:56Z) - RED-F: Reconstruction-Elimination based Dual-stream Contrastive Forecasting for Multivariate Time Series Anomaly Prediction [19.04414742117033]
We propose RED-F, a Reconstruction-Elimination based Dual-stream Contrastive Forecasting framework. The framework transforms the difficult task of absolute signal detection into a simpler, more robust task of relative trajectory comparison. Experiments on six real-world datasets demonstrate the superior capability of RED-F in anomaly prediction tasks.
arXiv Detail & Related papers (2025-11-25T08:11:41Z) - SimDiff: Simpler Yet Better Diffusion Model for Time Series Point Forecasting [8.141505251306622]
Diffusion models have recently shown promise in time series forecasting, yet they often fail to achieve state-of-the-art point estimation performance. We propose SimDiff, a single-stage, end-to-end framework for point estimation.
arXiv Detail & Related papers (2025-11-24T16:09:55Z) - ResAD: Normalized Residual Trajectory Modeling for End-to-End Autonomous Driving [64.42138266293202]
ResAD is a Normalized Residual Trajectory Modeling framework. It reframes the learning task to predict the residual deviation from an inertial reference. On the NAVSIM benchmark, ResAD achieves a state-of-the-art PDMS of 88.6 using a vanilla diffusion policy.
arXiv Detail & Related papers (2025-10-09T17:59:36Z) - Trustworthy Pedestrian Trajectory Prediction via Pattern-Aware Interaction Modeling [7.818805084618764]
InSyn is a novel Transformer-based model that captures diverse interaction patterns. SSOS is a training strategy designed to alleviate the common issue of initial-step divergence in numerical time-series prediction.
arXiv Detail & Related papers (2025-07-16T12:23:04Z) - Bridging the Last Mile of Prediction: Enhancing Time Series Forecasting with Conditional Guided Flow Matching [9.465542901469815]
Conditional Guided Flow Matching (CGFM) is a model-agnostic framework that extends flow matching by integrating outputs from an auxiliary predictive model. CGFM incorporates historical data as both conditions and guidance, uses two-sided conditional paths, and employs affine paths to expand the path space. Experiments across datasets and baselines show CGFM consistently outperforms state-of-the-art models, advancing forecasting.
arXiv Detail & Related papers (2025-07-09T18:03:31Z) - Elucidated Rolling Diffusion Models for Probabilistic Weather Forecasting [52.6508222408558]
We introduce Elucidated Rolling Diffusion Models (ERDM), the first framework to unify a rolling forecast structure with the principled, performant design of Elucidated Diffusion Models (EDM). On 2D Navier-Stokes simulations and ERA5 global weather forecasting at 1.5° resolution, ERDM consistently outperforms key diffusion-based baselines.
arXiv Detail & Related papers (2025-06-24T21:44:31Z) - PostCast: Generalizable Postprocessing for Precipitation Nowcasting via Unsupervised Blurriness Modeling [85.56969895866243]
We propose an unsupervised postprocessing method to eliminate the blurriness without the requirement of training with the pairs of blurry predictions and corresponding ground truth.
A zero-shot blur kernel estimation mechanism and an auto-scale denoise guidance strategy are introduced to adapt the unconditional correlations to any blurriness modes.
arXiv Detail & Related papers (2024-10-08T08:38:23Z) - Learning Robust Precipitation Forecaster by Temporal Frame Interpolation [65.5045412005064]
We develop a robust precipitation forecasting model that demonstrates resilience against spatial-temporal discrepancies.
Our approach has led to significant improvements in forecasting precision, culminating in our model securing 1st place in the transfer learning leaderboard of the Weather4cast'23 competition.
arXiv Detail & Related papers (2023-11-30T08:22:08Z) - Conditional Denoising Diffusion for Sequential Recommendation [62.127862728308045]
Two prominent generative models, Generative Adversarial Networks (GANs) and Variational AutoEncoders (VAEs), have well-known limitations: GANs suffer from unstable optimization, while VAEs are prone to posterior collapse and over-smoothed generations.
We present a conditional denoising diffusion model, which includes a sequence encoder, a cross-attentive denoising decoder, and a step-wise diffuser.
arXiv Detail & Related papers (2023-04-22T15:32:59Z) - Mitigating Catastrophic Forgetting in Scheduled Sampling with Elastic Weight Consolidation in Neural Machine Translation [15.581515781839656]
Autoregressive models trained with maximum likelihood estimation suffer from exposure bias.
We propose using Elastic Weight Consolidation as trade-off between mitigating exposure bias and retaining output quality.
Experiments on two IWSLT'14 translation tasks demonstrate that our approach alleviates catastrophic forgetting and significantly improves BLEU.
arXiv Detail & Related papers (2021-09-13T20:37:58Z)
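The Elastic Weight Consolidation trade-off mentioned in the entry above rests on a standard quadratic penalty that anchors important parameters near their previously learned values. A generic sketch of that penalty (this is the textbook EWC formulation, not the paper's exact implementation; the `lam` value and Fisher estimates are illustrative):

```python
import numpy as np

def ewc_penalty(params, anchor_params, fisher, lam=0.1):
    """Generic Elastic Weight Consolidation penalty:

        (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2

    where theta* are the parameters after the original task and F_i
    approximates the Fisher information, i.e. how much the old task's
    loss cares about parameter i.
    """
    total = 0.0
    for name in params:
        diff = params[name] - anchor_params[name]
        total += 0.5 * lam * np.sum(fisher[name] * diff ** 2)
    return total

# Illustrative values: drift on a high-Fisher coordinate is penalized
# heavily, while a low-Fisher coordinate may move almost freely.
params = {"w": np.array([1.0, 2.0])}
anchor = {"w": np.array([1.0, 0.0])}
fisher = {"w": np.array([10.0, 0.5])}
print(ewc_penalty(params, anchor, fisher, lam=0.1))
```

Added to the new training objective, this term lets later fine-tuning (here, scheduled sampling) adjust parameters the original MLE task did not depend on, which is how it mediates between mitigating exposure bias and retaining output quality.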
This list is automatically generated from the titles and abstracts of the papers in this site.