Causal Time Series Generation via Diffusion Models
- URL: http://arxiv.org/abs/2509.20846v1
- Date: Thu, 25 Sep 2025 07:34:46 GMT
- Title: Causal Time Series Generation via Diffusion Models
- Authors: Yutong Xia, Chang Xu, Yuxuan Liang, Qingsong Wen, Roger Zimmermann, Jiang Bian
- Abstract summary: We introduce causal time series generation as a new TSG task family, formalized within Pearl's causal ladder. To instantiate these tasks, we develop CaTSG, a unified diffusion-based framework. Experiments on both synthetic and real-world datasets show that CaTSG achieves superior fidelity.
- Score: 96.95879410279089
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series generation (TSG) synthesizes realistic sequences and has achieved remarkable success. Among TSG approaches, conditional models generate sequences given observed covariates; however, such models learn observational correlations without accounting for unobserved confounding. In this work, we propose a causal perspective on conditional TSG and introduce causal time series generation as a new TSG task family, formalized within Pearl's causal ladder, extending beyond observational generation to include interventional and counterfactual settings. To instantiate these tasks, we develop CaTSG, a unified diffusion-based framework with backdoor-adjusted guidance that causally steers sampling toward desired interventions and individual counterfactuals while preserving observational fidelity. Specifically, our method derives causal score functions via backdoor adjustment and the abduction-action-prediction procedure, thus enabling principled support for all three levels of TSG. Extensive experiments on both synthetic and real-world datasets show that CaTSG achieves superior fidelity and also supports interventional and counterfactual generation that existing baselines cannot handle. Overall, we propose the causal TSG family and instantiate it with CaTSG, providing an initial proof-of-concept and opening a promising direction toward more reliable simulation under interventions and counterfactual generation.
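The backdoor-adjusted guidance described in the abstract can be illustrated with a minimal sketch. Under the backdoor criterion, the interventional score marginalizes the confounder z: grad_x log p(x | do(c)) = grad_x log sum_z p(x | c, z) p(z), which works out to a weighted average of per-confounder conditional scores with weights proportional to p(x | c, z) p(z). The sketch below is illustrative only, not the authors' implementation; all function names (`cond_score`, `cond_logpdf`) and the discrete-confounder assumption are hypothetical.

```python
import numpy as np

def backdoor_adjusted_score(x, c, confounder_values, confounder_probs,
                            cond_score, cond_logpdf):
    """Illustrative backdoor-adjusted guidance score (hypothetical API).

    Approximates grad_x log p(x | do(c)) = grad_x log sum_z p(x | c, z) p(z)
    for a discrete confounder z by averaging the per-z conditional scores
    cond_score(x, c, z) with weights proportional to p(x | c, z) * p(z).
    """
    # Log-weights log p(x | c, z) + log p(z), normalized stably via softmax.
    logps = np.array([cond_logpdf(x, c, z) + np.log(pz)
                      for z, pz in zip(confounder_values, confounder_probs)])
    w = np.exp(logps - logps.max())
    w /= w.sum()
    # Weighted average of per-confounder conditional scores.
    scores = np.array([cond_score(x, c, z) for z in confounder_values])
    return (w[:, None] * scores).sum(axis=0)
```

In a diffusion sampler, this adjusted score would replace the usual conditional score p(x | c) at each denoising step, so the guidance targets the intervention do(c) rather than the observational conditional.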
Related papers
- Scalable Contrastive Causal Discovery under Unknown Soft Interventions [3.165716101116899]
We propose a scalable causal discovery model for paired observational and interventional settings with shared underlying causal structure and unknown soft interventions. Experiments on synthetic data demonstrate improved causal structure recovery, generalization to unseen graphs with held-out causal mechanisms, and scalability to larger graphs.
arXiv Detail & Related papers (2026-03-03T18:16:16Z) - GTS: Inference-Time Scaling of Latent Reasoning with a Learnable Gaussian Thought Sampler [54.10960908347221]
We model latent thought exploration as conditional sampling from learnable densities and instantiate this idea as a Gaussian Thought Sampler (GTS). GTS predicts context-dependent perturbation distributions over continuous reasoning states and is trained with GRPO-style policy optimization while keeping the backbone frozen.
arXiv Detail & Related papers (2026-02-15T09:57:47Z) - CoG: Controllable Graph Reasoning via Relational Blueprints and Failure-Aware Refinement over Knowledge Graphs [53.199517625701475]
CoG is a training-free framework inspired by Dual-Process Theory that mimics the interplay between intuition and deliberation. CoG significantly outperforms state-of-the-art approaches in both accuracy and efficiency.
arXiv Detail & Related papers (2026-01-16T07:27:40Z) - Context-aware gate set tomography: Improving the self-consistent characterization of trapped-ion universal gate sets by leveraging non-Markovianity [45.88028371034407]
Gate set tomography (GST) estimates the complete set of noisy quantum gates, state preparations, and measurements. In its original incarnation, GST improves the estimation precision by applying the gates sequentially. We show that context dependence can be incorporated in the parametrization of the gate set, allowing us to reduce the sampling cost of GST.
arXiv Detail & Related papers (2025-07-03T11:37:36Z) - CCD: Continual Consistency Diffusion for Lifelong Generative Modeling [29.568682321463886]
Continual Diffusion Generation (CDG) is a structured pipeline that redefines how diffusion models are implemented under continual learning. We propose the first theoretical foundation for CDG, grounded in a cross-task analysis of diffusion-specific generative dynamics. We show that CCD achieves SOTA performance across various benchmarks, especially improving generative metrics in overlapping-task scenarios.
arXiv Detail & Related papers (2025-05-17T09:49:25Z) - Spatial Reasoning with Denoising Models [49.83744014336816]
We introduce a framework to perform reasoning over sets of continuous variables via denoising generative models. We show, for the first time, that the denoising network itself can successfully predict the order of generation. Using these findings, we can increase the accuracy of specific reasoning tasks from 1% to >50%.
arXiv Detail & Related papers (2025-02-28T14:08:30Z) - A Fixed-Point Approach for Causal Generative Modeling [20.88890689294816]
We propose a novel formalism for describing Structural Causal Models (SCMs) as fixed-point problems on causally ordered variables. We establish the weakest known conditions for their unique recovery given the topological ordering (TO).
arXiv Detail & Related papers (2024-04-10T12:29:05Z) - Causal Temporal Regime Structure Learning [49.77103348208835]
We present CASTOR, a novel method that concurrently learns the Directed Acyclic Graph (DAG) for each regime. We establish the identifiability of the regimes and DAGs within our framework. Experiments show that CASTOR consistently outperforms existing causal discovery models.
arXiv Detail & Related papers (2023-11-02T17:26:49Z) - From Gradient Flow on Population Loss to Learning with Stochastic Gradient Descent [50.4531316289086]
Stochastic Gradient Descent (SGD) has been the method of choice for learning large-scale non-convex models. We provide general conditions under which SGD converges, assuming that gradient flow (GF) on the population loss converges. We provide a unified analysis for GD/SGD not only for classical settings like convex losses, but also for more complex problems including phase retrieval and matrix square root.
arXiv Detail & Related papers (2022-10-13T03:55:04Z) - An Empirical Study: Extensive Deep Temporal Point Process [61.14164208094238]
We first review recent research emphasis and difficulties in modeling asynchronous event sequences with deep temporal point processes. We propose a Granger causality discovery framework for exploiting the relations among multiple types of events.
arXiv Detail & Related papers (2021-10-19T10:15:00Z) - Causal Discovery from Conditionally Stationary Time Series [14.297325665581353]
We develop a causal discovery approach to handle a wide class of nonstationary time series. Named State-Dependent Causal Inference (SDCI), our approach is able to recover the underlying causal dependencies. Empirical experiments on nonlinear particle interaction data and gene regulatory networks demonstrate SDCI's superior performance.
arXiv Detail & Related papers (2021-10-12T18:12:57Z) - Causal Graph Discovery from Self and Mutually Exciting Time Series [12.802653884445132]
We develop a non-asymptotic recovery guarantee and quantifiable uncertainty by solving a linear program.
We demonstrate the effectiveness of our approach in recovering highly interpretable causal DAGs over Sepsis Associated Derangements (SADs).
arXiv Detail & Related papers (2021-06-04T16:59:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.