Causal Ordering for Structure Learning From Time Series
- URL: http://arxiv.org/abs/2510.24639v1
- Date: Tue, 28 Oct 2025 17:06:15 GMT
- Title: Causal Ordering for Structure Learning From Time Series
- Authors: Pedro P. Sanchez, Damian Machlanski, Steven McDonagh, Sotirios A. Tsaftaris,
- Abstract summary: Causal discovery in time series is hindered by the complexity of identifying true causal relationships. Traditional ordering methods inherently limit the representational capacity of the resulting model. We propose DOTS, using diffusion-based causal discovery for temporal data.
- Score: 8.2018747411276
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Predicting causal structure from time series data is crucial for understanding complex phenomena in physiology, brain connectivity, climate dynamics, and socio-economic behaviour. Causal discovery in time series is hindered by the combinatorial complexity of identifying true causal relationships, especially as the number of variables and time points grows. A common way to simplify the task is to use so-called ordering-based methods. Traditional ordering methods inherently limit the representational capacity of the resulting model. In this work, we fix this issue by leveraging multiple valid causal orderings instead of the single ordering used in standard practice. We propose DOTS (Diffusion Ordered Temporal Structure), which uses diffusion-based causal discovery for temporal data. By integrating multiple orderings, DOTS effectively recovers the transitive closure of the underlying directed acyclic graph, mitigating spurious artifacts inherent in single-ordering approaches. We formalise the problem under standard assumptions such as stationarity and the additive noise model, and leverage score matching with diffusion processes to enable efficient Hessian estimation. Extensive experiments validate the approach: empirical evaluations on synthetic and real-world datasets demonstrate that DOTS outperforms state-of-the-art baselines, offering a scalable and robust approach to temporal causal discovery. On synthetic benchmarks ($d = 3$-$6$ variables, $T = 200$-$5{,}000$ samples), DOTS improves mean window-graph $F_1$ from $0.63$ (best baseline) to $0.81$. On the CausalTime real-world benchmark ($d = 20$-$36$), while baselines remain the best on individual datasets, DOTS attains the highest average summary-graph $F_1$ while halving runtime relative to graph-optimisation methods. These results establish DOTS as a scalable and accurate solution for temporal causal discovery.
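The multiple-ordering idea in the abstract can be illustrated with a small sketch (my own toy construction, not the DOTS implementation): every topological ordering induces a fully-connected DAG over the variables, and intersecting the edge sets induced by several valid orderings keeps only the precedence relations common to all of them, which under the paper's claim corresponds to the transitive closure of the true DAG.

```python
def induced_edges(ordering):
    # Fully-connected DAG induced by a topological ordering:
    # every earlier variable points to every later one.
    pos = {v: i for i, v in enumerate(ordering)}
    return {(i, j) for i in ordering for j in ordering if pos[i] < pos[j]}

def intersect_orderings(orderings):
    # Keep only edges consistent with *every* valid ordering,
    # discarding artifacts that a single ordering would force.
    return set.intersection(*(induced_edges(o) for o in orderings))

# True DAG: chain 1 -> 2 -> 3, with node 0 independent of the chain.
# All four lists below are valid topological orderings of that DAG.
orderings = [[0, 1, 2, 3], [1, 0, 2, 3], [1, 2, 0, 3], [1, 2, 3, 0]]
print(sorted(intersect_orderings(orderings)))  # [(1, 2), (1, 3), (2, 3)]
```

The surviving edges are exactly the ancestor relations of the chain (its transitive closure); the independent node 0 acquires no spurious edges, whereas any single ordering would have forced three edges incident to it.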
Related papers
- Sparse Additive Model Pruning for Order-Based Causal Structure Learning [4.55744151522751]
Causal structure learning aims to estimate causal relationships between variables in the form of a causal directed acyclic graph (DAG) from observational data. One of the major frameworks is the order-based approach, which first estimates a topological order of the underlying DAG and then prunes spurious edges from the fully-connected DAG induced by the estimated topological order. We introduce a new pruning method based on sparse additive models, which enables direct pruning of redundant edges without relying on hypothesis testing.
arXiv Detail & Related papers (2026-02-17T02:06:42Z) - Flow based approach for Dynamic Temporal Causal models with non-Gaussian or Heteroscedastic Noises [37.02662517645979]
We introduce FANTOM, a unified framework for causal discovery. It handles non-stationary processes along with non-Gaussian and heteroscedastic noises. It simultaneously infers the number of regimes and their corresponding indices and learns each regime's Directed Acyclic Graph.
arXiv Detail & Related papers (2025-06-20T15:12:43Z) - Retrieving Classes of Causal Orders with Inconsistent Knowledge Bases [0.8192907805418583]
Large Language Models (LLMs) have emerged as a promising alternative for extracting causal knowledge from text-based metadata. LLMs tend to be unreliable and prone to hallucinations, necessitating strategies that account for their limitations. We present a new method to derive a class of acyclic tournaments, which represent plausible causal orders.
arXiv Detail & Related papers (2024-12-18T16:37:51Z) - On the Identification of Temporally Causal Representation with Instantaneous Dependence [50.14432597910128]
Temporally causal representation learning aims to identify the latent causal process from time series observations.
Most methods require the assumption that the latent causal processes do not have instantaneous relations.
We propose an IDentification framework for instantaneOus Latent dynamics (IDOL).
arXiv Detail & Related papers (2024-05-24T08:08:05Z) - SimAD: A Simple Dissimilarity-based Approach for Time Series Anomaly Detection [23.684577046512747]
We introduce a Simple dissimilarity-based approach for time series Anomaly Detection, referred to as SimAD. SimAD first incorporates a patching-based feature extractor capable of processing extended temporal windows and employs the EmbedPatch encoder to fully integrate normal behavioral patterns. Second, we design an innovative ContrastFusion module in SimAD, which strengthens the robustness of anomaly detection by highlighting the distributional differences between normal and abnormal data.
arXiv Detail & Related papers (2024-05-18T09:37:04Z) - AcceleratedLiNGAM: Learning Causal DAGs at the speed of GPUs [57.12929098407975]
We show that by efficiently parallelizing existing causal discovery methods, we can scale them to thousands of dimensions.
Specifically, we focus on the causal ordering subprocedure in DirectLiNGAM and implement GPU kernels to accelerate it.
This allows us to apply DirectLiNGAM to causal inference on large-scale gene expression data with genetic interventions yielding competitive results.
arXiv Detail & Related papers (2024-03-06T15:06:11Z) - Causal Temporal Regime Structure Learning [49.77103348208835]
We present CASTOR, a novel method that concurrently learns the Directed Acyclic Graph (DAG) for each regime. We establish the identifiability of the regimes and DAGs within our framework. Experiments show that CASTOR consistently outperforms existing causal discovery models.
arXiv Detail & Related papers (2023-11-02T17:26:49Z) - Towards Faster Non-Asymptotic Convergence for Diffusion-Based Generative Models [49.81937966106691]
We develop a suite of non-asymptotic theory towards understanding the data generation process of diffusion models.
In contrast to prior works, our theory is developed based on an elementary yet versatile non-asymptotic approach.
arXiv Detail & Related papers (2023-06-15T16:30:08Z) - A Scale-Invariant Sorting Criterion to Find a Causal Order in Additive Noise Models [49.038420266408586]
We show that sorting variables by increasing variance often yields an ordering close to a causal order.
We propose an efficient baseline algorithm termed $R^2$-SortnRegress that exploits high $R^2$-sortability.
Our findings reveal high $R^2$-sortability as an assumption about the data generating process relevant to causal discovery.
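The variance-sorting criterion can be demonstrated on a toy linear additive-noise chain (a minimal sketch with illustrative coefficients; the paper's point is that this heuristic works often, not always, since it depends on how variance accumulates downstream):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)                  # root cause, variance ~1
x2 = 2.0 * x1 + rng.normal(size=n)       # effect, variance ~5
x3 = 1.5 * x2 + rng.normal(size=n)       # downstream effect, variance ~12.25

# Shuffle the columns so the causal order is not the column order.
data = np.stack([x3, x1, x2], axis=1)

# Sorting criterion: order variables by increasing marginal variance.
order = np.argsort(data.var(axis=0))
print(order)  # [1 2 0], i.e. x1 -> x2 -> x3 is recovered
```

In this synthetic setup variance grows strictly along the causal chain, so the sort recovers the true order; rescaling any variable (hence the paper's emphasis on a *scale-invariant* criterion such as $R^2$-sortability) would break plain variance sorting.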
arXiv Detail & Related papers (2023-03-31T17:05:46Z) - Diffusion Models for Causal Discovery via Topological Ordering [20.875222263955045]
Topological ordering approaches reduce the optimisation space of causal discovery by searching over a permutation rather than graph space.
For ANMs, the Hessian of the data log-likelihood can be used for finding leaf nodes in a causal graph, allowing its topological ordering.
We introduce theory for updating the learned Hessian without re-training the neural network, and we show that computing with a subset of samples gives an accurate approximation of the ordering.
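The Hessian-based leaf rule can be checked analytically on a two-variable ANM (a hand-rolled sketch, not the paper's estimator: in practice the score and its Jacobian come from a trained score network rather than a closed form). For $x_2 = f(x_1) + \varepsilon$ with Gaussian noise, the diagonal Hessian entry of the leaf, $\partial^2 \log p / \partial x_2^2 = -1/\sigma^2$, is constant across samples, while the root's entry varies, so the leaf is the variable with minimum Hessian-diagonal variance.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5
x1 = rng.normal(size=5_000)                    # root, standard normal
x2 = x1**3 + sigma * rng.normal(size=5_000)    # leaf: x2 = f(x1) + noise, f(x) = x^3

# Analytic diagonal of the Hessian of log p(x1, x2) for this ANM.
f, fp, fpp = x1**3, 3 * x1**2, 6 * x1
h11 = -1.0 - fp**2 / sigma**2 + (x2 - f) * fpp / sigma**2  # varies with the sample
h22 = np.full_like(x1, -1.0 / sigma**2)                    # constant for the leaf

# Leaf-finding rule: pick the variable whose Hessian-diagonal entry
# has (near-)zero variance across samples.
leaf = int(np.argmin([h11.var(), h22.var()]))
print(leaf)  # 1, i.e. x2 is correctly identified as the leaf
```

Removing the identified leaf and repeating yields a full topological ordering; the paper's contribution is updating the learned Hessian between such iterations without re-training and subsampling for speed.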
arXiv Detail & Related papers (2022-10-12T13:36:29Z) - TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
arXiv Detail & Related papers (2020-09-16T15:52:04Z) - An Analysis of the Adaptation Speed of Causal Models [80.77896315374747]
Recently, Bengio et al. conjectured that among all candidate models, the causal model $G$ is the fastest to adapt from one dataset to another.
We investigate the adaptation speed of cause-effect SCMs using convergence rates from optimization.
Surprisingly, we find situations where the anticausal model is advantaged, falsifying the initial hypothesis.
arXiv Detail & Related papers (2020-05-18T23:48:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.