Semantically-Guided Inference for Conditional Diffusion Models: Enhancing Covariate Consistency in Time Series Forecasting
- URL: http://arxiv.org/abs/2508.01761v1
- Date: Sun, 03 Aug 2025 14:04:04 GMT
- Title: Semantically-Guided Inference for Conditional Diffusion Models: Enhancing Covariate Consistency in Time Series Forecasting
- Authors: Rui Ding, Hanyang Meng, Zeyang Zhang, Jielong Yang
- Abstract summary: SemGuide is a plug-and-play, inference-time method that enhances covariate consistency in conditional diffusion models. Our approach introduces a scoring network to assess the semantic alignment between intermediate diffusion states and future covariates.
- Score: 6.716179859091235
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Diffusion models have demonstrated strong performance in time series forecasting, yet often suffer from semantic misalignment between generated trajectories and conditioning covariates, especially under complex or multimodal conditions. To address this issue, we propose SemGuide, a plug-and-play, inference-time method that enhances covariate consistency in conditional diffusion models. Our approach introduces a scoring network to assess the semantic alignment between intermediate diffusion states and future covariates. These scores serve as proxy likelihoods in a stepwise importance reweighting procedure, which progressively adjusts the sampling path without altering the original training process. The method is model-agnostic and compatible with any conditional diffusion framework. Experiments on real-world forecasting tasks show consistent gains in both predictive accuracy and covariate alignment, with especially strong performance under complex conditioning scenarios.
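The abstract describes a stepwise importance-reweighting loop: at each reverse-diffusion step, a scoring network rates how well each intermediate state aligns with the future covariates, and those scores act as proxy likelihoods for resampling the candidate trajectories. The paper does not publish an implementation here, so the following is only a minimal sketch of that idea; the function names (`denoise_step`, `score_net`, `semantic_guided_sampling`), the particle-resampling scheme, and all parameters are assumptions for illustration, not the authors' code.

```python
import numpy as np

def semantic_guided_sampling(denoise_step, score_net, covariates,
                             x_T, n_steps, n_particles=8, rng=None):
    """Hypothetical sketch of SemGuide-style importance reweighting.

    denoise_step(x, t): one reverse step of any pretrained conditional
        diffusion model (left unchanged, matching the plug-and-play claim).
    score_net(x, c): nonnegative proxy likelihood that intermediate state
        x is semantically aligned with future covariates c.
    """
    rng = rng or np.random.default_rng()
    # Maintain a population of candidate trajectories (particles).
    particles = [np.array(x_T, copy=True) for _ in range(n_particles)]
    for t in range(n_steps - 1, -1, -1):
        # Advance every particle one reverse step with the base model.
        particles = [denoise_step(x, t) for x in particles]
        # Score alignment of each intermediate state with the covariates.
        weights = np.array([score_net(x, covariates) for x in particles])
        weights = weights / weights.sum()
        # Resample in proportion to the proxy likelihoods, progressively
        # steering the sampling path toward covariate-consistent states.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = [np.array(particles[i], copy=True) for i in idx]
    # Return the final state that scores highest against the covariates.
    final_scores = [score_net(x, covariates) for x in particles]
    return particles[int(np.argmax(final_scores))]
```

Because the reweighting happens purely at sampling time, the base model's training is untouched, which is what makes the approach model-agnostic in the sense the abstract claims.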
Related papers
- Unified Flow Matching for Long Horizon Event Forecasting [3.0639815065447036]
We propose a unified flow matching framework for marked temporal point processes. By learning continuous-time flows for both components, our method generates coherent long horizon event trajectories without sequential decoding. We evaluate our model on six real-world benchmarks and demonstrate significant improvements over autoregressive and diffusion-based baselines in both accuracy and generation efficiency.
arXiv Detail & Related papers (2025-08-06T19:42:49Z) - Inference-Time Scaling of Diffusion Language Models with Particle Gibbs Sampling [62.640128548633946]
We introduce a novel inference-time scaling approach based on particle Gibbs sampling for discrete diffusion models. Our method consistently outperforms prior inference-time strategies on reward-guided text generation tasks.
arXiv Detail & Related papers (2025-07-11T08:00:47Z) - Consistent World Models via Foresight Diffusion [56.45012929930605]
We argue that a key bottleneck in learning consistent diffusion-based world models lies in their suboptimal predictive ability. We propose Foresight Diffusion (ForeDiff), a diffusion-based world modeling framework that enhances consistency by decoupling condition understanding from target denoising.
arXiv Detail & Related papers (2025-05-22T10:01:59Z) - Probabilistic Forecasting via Autoregressive Flow Matching [1.5467259918426441]
FlowTime is a generative model for probabilistic forecasting of time series data. We decompose the joint distribution of future observations into a sequence of conditional densities, each modeled via a shared flow. We demonstrate the effectiveness of FlowTime on multiple dynamical systems and real-world forecasting tasks.
arXiv Detail & Related papers (2025-03-13T13:54:24Z) - Dynamical Diffusion: Learning Temporal Dynamics with Diffusion Models [71.63194926457119]
We introduce Dynamical Diffusion (DyDiff), a theoretically sound framework that incorporates temporally aware forward and reverse processes. Experiments across scientific spatiotemporal forecasting, video prediction, and time series forecasting demonstrate that Dynamical Diffusion consistently improves performance in temporal predictive tasks.
arXiv Detail & Related papers (2025-03-02T16:10:32Z) - On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z) - Continuous Ensemble Weather Forecasting with Diffusion models [10.730406954385927]
Continuous Ensemble Forecasting is a novel and flexible method for sampling ensemble forecasts in diffusion models. It can generate temporally consistent ensemble trajectories completely in parallel, with no autoregressive steps. We demonstrate that the method achieves competitive results for global weather forecasting with good probabilistic properties.
arXiv Detail & Related papers (2024-10-07T18:51:23Z) - Channel-aware Contrastive Conditional Diffusion for Multivariate Probabilistic Time Series Forecasting [19.383395337330082]
We propose a generic channel-aware Contrastive Conditional Diffusion model entitled CCDM.
The proposed CCDM can exhibit superior forecasting capability compared to current state-of-the-art diffusion forecasters.
arXiv Detail & Related papers (2024-10-03T03:13:15Z) - Time-series Generation by Contrastive Imitation [87.51882102248395]
We study a generative framework that seeks to combine the strengths of both: Motivated by a moment-matching objective to mitigate compounding error, we optimize a local (but forward-looking) transition policy.
At inference, the learned policy serves as the generator for iterative sampling, and the learned energy serves as a trajectory-level measure for evaluating sample quality.
arXiv Detail & Related papers (2023-11-02T16:45:25Z) - Sequential Bayesian Neural Subnetwork Ensembles [4.6354120722975125]
We propose an approach for sequential ensembling of dynamic Bayesian neural subnetworks that consistently maintains reduced model complexity throughout the training process.
Our proposed approach outperforms traditional dense and sparse deterministic and Bayesian ensemble models in terms of prediction accuracy, uncertainty estimation, out-of-distribution detection, and adversarial robustness.
arXiv Detail & Related papers (2022-06-01T22:57:52Z) - Composing Normalizing Flows for Inverse Problems [89.06155049265641]
We propose a framework for approximate inference that estimates the target conditional as a composition of two flow models.
Our method is evaluated on a variety of inverse problems and is shown to produce high-quality samples with uncertainty.
arXiv Detail & Related papers (2020-02-26T19:01:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.