Causal structure learning from time series: Large regression
coefficients may predict causal links better in practice than small p-values
- URL: http://arxiv.org/abs/2002.09573v2
- Date: Wed, 2 Sep 2020 09:24:53 GMT
- Title: Causal structure learning from time series: Large regression
coefficients may predict causal links better in practice than small p-values
- Authors: Sebastian Weichwald, Martin E Jakobsen, Phillip B Mogensen, Lasse
Petersen, Nikolaj Thams, Gherardo Varando
- Abstract summary: We describe the algorithms for causal structure learning from time series data that won the Causality 4 Climate competition.
We examine how our combination of established ideas achieves competitive performance on semi-realistic and realistic time series data.
- Score: 4.014393692461288
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this article, we describe the algorithms for causal structure learning
from time series data that won the Causality 4 Climate competition at the
Conference on Neural Information Processing Systems 2019 (NeurIPS). We examine
how our combination of established ideas achieves competitive performance on
semi-realistic and realistic time series data exhibiting common challenges in
real-world Earth sciences data. In particular, we discuss a) a rationale for
leveraging linear methods to identify causal links in non-linear systems, b) a
simulation-backed explanation as to why large regression coefficients may
predict causal links better in practice than small p-values and thus why
normalising the data may sometimes hinder causal structure learning.
For benchmark usage, we detail the algorithms here and provide
implementations at https://github.com/sweichwald/tidybench . We propose the
presented competition-proven methods for baseline benchmark comparisons to
guide the development of novel algorithms for structure learning from time
series.
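
To make point b) concrete, here is a minimal sketch of coefficient-based edge scoring (in Python with NumPy; the helper var_coefficient_scores and the toy simulation are illustrative assumptions, not the authors' tidybench code). It fits an ordinary least-squares vector autoregression (VAR) to a multivariate time series and scores each candidate causal link by the magnitude of its lagged regression coefficient rather than by its p-value:

    # Minimal sketch, not the authors' tidybench implementation.
    import numpy as np

    def var_coefficient_scores(X, lags=1):
        """Score candidate causal links in a (T, d) time series array X.

        Returns a (d, d) matrix S where S[i, j] is the largest absolute
        regression coefficient of variable i (at any lag) onto variable j.
        """
        T, d = X.shape
        # Stack lagged copies of X as predictors for X[t]:
        # columns ordered [lag-1 variables | lag-2 variables | ...].
        predictors = np.hstack([X[lags - k - 1 : T - k - 1] for k in range(lags)])
        targets = X[lags:]
        # Ordinary least squares; coefs has shape (lags * d, d).
        coefs, *_ = np.linalg.lstsq(predictors, targets, rcond=None)
        # Aggregate over lags: max |coefficient| per (cause, effect) pair.
        return np.abs(coefs).reshape(lags, d, d).max(axis=0)

    # Toy usage: variable 0 drives variable 1 with coefficient 0.8.
    rng = np.random.default_rng(0)
    T, d = 500, 3
    X = rng.normal(size=(T, d))
    for t in range(1, T):
        X[t, 1] += 0.8 * X[t - 1, 0]
    print(var_coefficient_scores(X, lags=1).round(2))

On this toy example, the (0, 1) entry of the score matrix stands out near 0.8 while the remaining entries stay close to zero; ranking edges by such coefficient magnitudes, rather than by per-coefficient p-values, is the idea referred to in point b) above, which the competition algorithms in tidybench build on.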
Related papers
- TS-CausalNN: Learning Temporal Causal Relations from Non-linear Non-stationary Time Series Data [0.42156176975445486]
We propose a Time-Series Causal Neural Network (TS-CausalNN) to discover contemporaneous and lagged causal relations simultaneously.
Beyond its simple parallel design, the proposed model naturally handles the non-stationarity and non-linearity of the data.
arXiv Detail & Related papers (2024-04-01T20:33:29Z)
- Multi-modal Causal Structure Learning and Root Cause Analysis [67.67578590390907]
We propose Mulan, a unified multi-modal causal structure learning method for root cause localization.
We leverage a log-tailored language model to facilitate log representation learning, converting log sequences into time-series data.
We also introduce a novel key performance indicator-aware attention mechanism for assessing modality reliability and co-learning a final causal graph.
arXiv Detail & Related papers (2024-02-04T05:50:38Z)
- Deep Ensembles Meets Quantile Regression: Uncertainty-aware Imputation for Time Series [49.992908221544624]
Time series data often exhibit numerous missing values; filling these in is the task of time series imputation.
Previous deep learning methods have been shown to be effective for time series imputation.
We propose a non-generative time series imputation method that produces accurate imputations with inherent uncertainty.
arXiv Detail & Related papers (2023-12-03T05:52:30Z)
- CausalTime: Realistically Generated Time-series for Benchmarking of Causal Discovery [14.092834149864514]
This study introduces the CausalTime pipeline to generate time series that closely resemble real data.
The pipeline starts from real observations in a specific scenario and produces a matching benchmark dataset.
In the experiments, we validate the fidelity of the generated data through qualitative and quantitative experiments, followed by a benchmarking of existing TSCD algorithms.
arXiv Detail & Related papers (2023-10-03T02:29:19Z)
- CUTS: Neural Causal Discovery from Irregular Time-Series Data [27.06531262632836]
Causal discovery from time-series data has been a central task in machine learning.
We present CUTS, a neural Granger causal discovery algorithm to jointly impute unobserved data points and build causal graphs.
Our approach constitutes a promising step towards applying causal discovery to real applications with non-ideal observations.
arXiv Detail & Related papers (2023-02-15T04:16:34Z)
- NODAGS-Flow: Nonlinear Cyclic Causal Structure Learning [8.20217860574125]
We propose a novel framework for learning nonlinear cyclic causal models from interventional data, called NODAGS-Flow.
We show significant performance improvements with our approach compared to state-of-the-art methods with respect to structure recovery and predictive performance.
arXiv Detail & Related papers (2023-01-04T23:28:18Z)
- Amortized Inference for Causal Structure Learning [72.84105256353801]
Learning causal structure poses a search problem that typically involves evaluating structures using a score or independence test.
We train a variational inference model to predict the causal structure from observational/interventional data.
Our models exhibit robust generalization capabilities under substantial distribution shift.
arXiv Detail & Related papers (2022-05-25T17:37:08Z)
- Learning ODE Models with Qualitative Structure Using Gaussian Processes [0.6882042556551611]
In many contexts explicit data collection is expensive and learning algorithms must be data-efficient to be feasible.
We propose an approach to learning a vector field of differential equations using sparse Gaussian Processes.
We show that this combination improves extrapolation performance and long-term behaviour significantly, while also reducing the computational cost.
arXiv Detail & Related papers (2020-11-10T19:34:07Z)
- Network Classifiers Based on Social Learning [71.86764107527812]
We propose a new way of combining independently trained classifiers over space and time.
The proposed architecture is able to improve prediction performance over time with unlabeled data.
We show that this strategy results in consistent learning with high probability, and that it yields a structure that is robust to poorly trained classifiers.
arXiv Detail & Related papers (2020-10-23T11:18:20Z)
- A Constraint-Based Algorithm for the Structural Learning of Continuous-Time Bayesian Networks [70.88503833248159]
We propose the first constraint-based algorithm for learning the structure of continuous-time Bayesian networks.
We discuss the different statistical tests and the underlying hypotheses used by our proposal to establish conditional independence.
arXiv Detail & Related papers (2020-07-07T07:34:09Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.