ReTimeCausal: EM-Augmented Additive Noise Models for Interpretable Causal Discovery in Irregular Time Series
- URL: http://arxiv.org/abs/2507.03310v1
- Date: Fri, 04 Jul 2025 05:39:50 GMT
- Title: ReTimeCausal: EM-Augmented Additive Noise Models for Interpretable Causal Discovery in Irregular Time Series
- Authors: Weihong Li, Anpeng Wu, Kun Kuang, Keting Yin
- Abstract summary: This paper studies causal discovery in irregularly sampled time series in high-stakes domains like finance, healthcare, and climate science. We propose ReTimeCausal, a novel integration of Additive Noise Models (ANM) and Expectation-Maximization (EM) that unifies physics-guided data imputation with sparse causal inference.
- Score: 32.21736212737614
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper studies causal discovery in irregularly sampled time series-a pivotal challenge in high-stakes domains like finance, healthcare, and climate science, where missing data and inconsistent sampling frequencies distort causal mechanisms. Traditional methods (e.g., Granger causality, PCMCI) fail to reconcile multi-scale interactions (e.g., hourly storms vs. decadal climate shifts), while neural approaches (e.g., CUTS+) lack interpretability, stemming from a critical gap: existing frameworks either rigidly assume temporal regularity or aggregate dynamics into opaque representations, neglecting real-world granularity and auditable logic. To bridge this gap, we propose ReTimeCausal, a novel integration of Additive Noise Models (ANM) and Expectation-Maximization (EM) that unifies physics-guided data imputation with sparse causal inference. Through kernelized sparse regression and structural constraints, ReTimeCausal iteratively refines missing values (E-step) and causal graphs (M-step), resolving cross-frequency dependencies and missing data issues. Extensive experiments on synthetic and real-world datasets demonstrate that ReTimeCausal outperforms existing state-of-the-art methods under challenging irregular sampling and missing data conditions.
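The E-step/M-step alternation described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the hypothetical example below substitutes a linear Lasso on lag-1 values for the paper's kernelized sparse regression and uses a simple plug-in rule for re-imputing missing entries, purely to show how missing-value refinement (E-step) and sparse causal-graph estimation (M-step) can alternate. All variable names and hyperparameters (e.g., `alpha=0.05`, 10 iterations) are illustrative assumptions.

```python
# Minimal sketch of an EM-style alternation between imputation (E-step) and
# sparse lagged regression (M-step), loosely inspired by the ReTimeCausal idea.
# NOT the authors' method: linear Lasso stands in for kernelized sparse regression.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Toy data: x1 drives x2 at lag 1; roughly 20% of entries are missing at random.
T, lag = 300, 1
x1 = rng.normal(size=T)
x2 = 0.8 * np.roll(x1, 1) + 0.1 * rng.normal(size=T)
X = np.column_stack([x1, x2])
mask = rng.random(X.shape) < 0.2              # True where a value is missing
X_obs = np.where(mask, np.nan, X)

# Initialise missing entries with column means.
X_imp = np.where(mask, np.nanmean(X_obs, axis=0), X_obs)
d = X_imp.shape[1]

for _ in range(10):
    # "M-step": for each variable, sparse regression on lag-1 values of all variables.
    models, coefs = [], []
    for j in range(d):
        y = X_imp[lag:, j]
        Z = X_imp[:-lag, :]
        model = Lasso(alpha=0.05).fit(Z, y)
        models.append(model)
        coefs.append(model.coef_)

    # "E-step": re-impute the missing entries from the fitted lagged predictions.
    for j in range(d):
        pred = models[j].predict(X_imp[:-lag, :])
        miss = mask[lag:, j]
        X_imp[lag:, j][miss] = pred[miss]

# Nonzero entries of the coefficient matrix indicate candidate lag-1 causal links
# (row = effect, column = cause).
print(np.round(np.array(coefs), 3))
```

In this toy setup the lag-1 link from x1 to x2 should surface as the dominant nonzero coefficient, while the remaining entries are shrunk toward zero by the sparsity penalty; the paper's kernelized variant would additionally capture nonlinear functional forms.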
Related papers
- Flow-Based Non-stationary Temporal Regime Causal Structure Learning [49.77103348208835]
We introduce FANTOM, a unified framework for causal discovery. It handles non-stationary processes along with non-Gaussian and heteroscedastic noises. It simultaneously infers the number of regimes and their corresponding indices and learns each regime's Directed Acyclic Graph.
arXiv Detail & Related papers (2025-06-20T15:12:43Z)
- Temporal Causal-based Simulation for Realistic Time-series Generation [1.49201581313345]
Causal Discovery plays a pivotal role in revealing relationships among observed variables, particularly in the temporal setup. Generation techniques that depend on simplified assumptions about causal structure, effects, and time limit the quality and diversity of the simulated data. We introduce Temporal Causal-based Simulation (TCS), a robust framework for generating realistic time-series data and their associated temporal causal graphs.
arXiv Detail & Related papers (2025-06-02T10:59:48Z)
- TimeGraph: Synthetic Benchmark Datasets for Robust Time-Series Causal Discovery [4.07304559469381]
We introduce TimeGraph, a comprehensive suite of synthetic time-series benchmark datasets. Each dataset is accompanied by a fully specified causal graph featuring varying densities and diverse noise distributions. We demonstrate the utility of TimeGraph through systematic evaluations of state-of-the-art causal discovery algorithms.
arXiv Detail & Related papers (2025-06-02T06:34:11Z)
- Causal Discovery from Time-Series Data with Short-Term Invariance-Based Convolutional Neural Networks [12.784885649573994]
Causal discovery from time-series data aims to capture both intra-slice (contemporaneous) and inter-slice (time-lagged) causality.
We propose STIC, a novel gradient-based causal discovery approach that focuses on Short-Term Invariance using Convolutional neural networks.
arXiv Detail & Related papers (2024-08-15T08:43:28Z)
- Causal Inference from Slowly Varying Nonstationary Processes [2.3072402651280517]
Causal inference from observational data hinges on an asymmetry between cause and effect that arises from the data-generating mechanisms.
We propose a new class of restricted structural causal models, via a time-varying filter and stationary noise, and exploit the asymmetry from nonstationarity for causal identification.
arXiv Detail & Related papers (2024-05-11T04:15:47Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Causal Temporal Regime Structure Learning [49.77103348208835]
We present CASTOR, a novel method that concurrently learns the Directed Acyclic Graph (DAG) for each regime. We establish the identifiability of the regimes and DAGs within our framework. Experiments show that CASTOR consistently outperforms existing causal discovery models.
arXiv Detail & Related papers (2023-11-02T17:26:49Z)
- Continuous-Time Modeling of Counterfactual Outcomes Using Neural Controlled Differential Equations [84.42837346400151]
Estimating counterfactual outcomes over time has the potential to unlock personalized healthcare.
Existing causal inference approaches consider regular, discrete-time intervals between observations and treatment decisions.
We propose a controllable simulation environment based on a model of tumor growth for a range of scenarios.
arXiv Detail & Related papers (2022-06-16T17:15:15Z)
- Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z)
- TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
arXiv Detail & Related papers (2020-09-16T15:52:04Z)