High-recall causal discovery for autocorrelated time series with latent
confounders
- URL: http://arxiv.org/abs/2007.01884v3
- Date: Mon, 1 Feb 2021 19:00:08 GMT
- Title: High-recall causal discovery for autocorrelated time series with latent
confounders
- Authors: Andreas Gerhardus and Jakob Runge
- Abstract summary: We show that existing causal discovery methods such as FCI and variants suffer from low recall in the autocorrelated time series case.
We provide Python code for all methods involved in the simulation studies.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a new method for linear and nonlinear, lagged and contemporaneous
constraint-based causal discovery from observational time series in the
presence of latent confounders. We show that existing causal discovery methods
such as FCI and variants suffer from low recall in the autocorrelated time
series case and identify low effect size of conditional independence tests as
the main reason. Information-theoretical arguments show that effect size can
often be increased if causal parents are included in the conditioning sets. To
identify parents early on, we suggest an iterative procedure that utilizes
novel orientation rules to determine ancestral relationships already during the
edge removal phase. We prove that the method is order-independent, and sound
and complete in the oracle case. Extensive simulation studies for different
numbers of variables, time lags, sample sizes, and further cases demonstrate
that our method indeed achieves much higher recall than existing methods for
the case of autocorrelated continuous variables while keeping false positives
at the desired level. This performance gain grows with stronger
autocorrelation. At https://github.com/jakobrunge/tigramite we provide Python
code for all methods involved in the simulation studies.
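The abstract's central claim — that conditioning on a variable's causal parents raises the effect size of conditional independence tests under strong autocorrelation — can be illustrated with a minimal toy simulation. This is our own sketch of the phenomenon, not the paper's method (the full algorithm, LPCMCI, is in the tigramite repository linked above); the structural model and all coefficients below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000
b, c = 0.95, 0.5  # strong autocorrelation of Y, moderate causal effect X -> Y

# Toy structural model: X_t is iid noise, Y_t = b*Y_{t-1} + c*X_t + noise.
X = rng.standard_normal(T)
eta = rng.standard_normal(T)
Y = np.zeros(T)
for t in range(1, T):
    Y[t] = b * Y[t - 1] + c * X[t] + eta[t]

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out z (OLS residuals)."""
    Z = np.column_stack([z, np.ones_like(z)])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Effect size of the test for the true link X_t -> Y_t:
r_marginal = np.corrcoef(X[1:], Y[1:])[0, 1]          # parent Y_{t-1} not conditioned on
r_given_parent = partial_corr(X[1:], Y[1:], Y[:-1])   # parent Y_{t-1} in the conditioning set

print(f"marginal corr:      {r_marginal:.2f}")
print(f"corr given parent:  {r_given_parent:.2f}")
```

Because Var(Y) is dominated by the autoregressive term, the marginal correlation is heavily diluted; regressing out the parent Y_{t-1} removes that variance and roughly triples the observed effect size, which is why the paper's iterative procedure tries to identify parents already during the edge removal phase.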
Related papers
- On the Identification of Temporally Causal Representation with Instantaneous Dependence [50.14432597910128]
Temporally causal representation learning aims to identify the latent causal process from time series observations.
Most methods require the assumption that the latent causal processes do not have instantaneous relations.
We propose IDOL, an IDentification framework for instantaneOus Latent dynamics.
arXiv Detail & Related papers (2024-05-24T08:08:05Z) - AcceleratedLiNGAM: Learning Causal DAGs at the speed of GPUs [57.12929098407975]
We show that by efficiently parallelizing existing causal discovery methods, we can scale them to thousands of dimensions.
Specifically, we focus on the causal ordering subprocedure in DirectLiNGAM and implement GPU kernels to accelerate it.
This allows us to apply DirectLiNGAM to causal inference on large-scale gene expression data with genetic interventions yielding competitive results.
arXiv Detail & Related papers (2024-03-06T15:06:11Z) - Causal Feature Selection via Transfer Entropy [59.999594949050596]
Causal discovery aims to identify causal relationships between features with observational data.
We introduce a new causal feature selection approach that relies on the forward and backward feature selection procedures.
We provide theoretical guarantees on the regression and classification errors for both the exact and the finite-sample cases.
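For Gaussian variables, transfer entropy has a closed form in terms of partial correlation, which makes a causal feature score cheap to compute. The following is our own minimal lag-1 illustration under that Gaussian assumption, not the paper's selection algorithm; the coupled AR process and its coefficients are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 5000
X = np.zeros(T)
Y = np.zeros(T)
# Coupled AR(1) processes with a true causal link X_{t-1} -> Y_t and no feedback.
for t in range(1, T):
    X[t] = 0.5 * X[t - 1] + rng.standard_normal()
    Y[t] = 0.5 * Y[t - 1] + 0.4 * X[t - 1] + rng.standard_normal()

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out z (OLS residuals)."""
    Z = np.column_stack([z, np.ones_like(z)])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

def gaussian_te(src, dst):
    """Lag-1 transfer entropy src -> dst under Gaussianity:
    TE = -0.5 * ln(1 - pcorr(dst_t, src_{t-1} | dst_{t-1})^2)."""
    rho = partial_corr(dst[1:], src[:-1], dst[:-1])
    return -0.5 * np.log(1.0 - rho ** 2)

te_xy = gaussian_te(X, Y)  # true causal direction: clearly positive
te_yx = gaussian_te(Y, X)  # no causal link: near zero
print(f"TE X->Y = {te_xy:.3f}, TE Y->X = {te_yx:.3f}")
```

Ranking candidate features by such a score, forward to add and backward to prune, is the general shape of a transfer-entropy-based selection procedure; the paper's guarantees concern the resulting regression and classification errors.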
arXiv Detail & Related papers (2023-10-17T08:04:45Z) - CARLA: Self-supervised Contrastive Representation Learning for Time Series Anomaly Detection [53.83593870825628]
One main challenge in time series anomaly detection (TSAD) is the lack of labelled data in many real-life scenarios.
Most of the existing anomaly detection methods focus on learning the normal behaviour of unlabelled time series in an unsupervised manner.
We introduce a novel end-to-end self-supervised ContrAstive Representation Learning approach for time series anomaly detection.
arXiv Detail & Related papers (2023-08-18T04:45:56Z) - Less is More: Mitigate Spurious Correlations for Open-Domain Dialogue
Response Generation Models by Causal Discovery [52.95935278819512]
We conduct the first study on spurious correlations for open-domain response generation models, based on CGDIALOG, a corpus curated in our work.
Inspired by causal discovery algorithms, we propose a novel model-agnostic method for training and inference of response generation models.
arXiv Detail & Related papers (2023-03-02T06:33:48Z) - CDANs: Temporal Causal Discovery from Autocorrelated and Non-Stationary
Time Series Data [5.130175508025212]
Causal discovery holds the potential to play a significant role in extracting actionable insights about human health.
We present a novel constraint-based causal discovery approach for autocorrelated and non-stationary time series data.
Our approach identifies lagged and instantaneous/contemporaneous causal relationships along with changing modules that vary over time.
arXiv Detail & Related papers (2023-02-07T04:13:48Z) - Sequential Kernelized Independence Testing [101.22966794822084]
We design sequential kernelized independence tests inspired by kernelized dependence measures.
We demonstrate the power of our approaches on both simulated and real data.
arXiv Detail & Related papers (2022-12-14T18:08:42Z) - Rhino: Deep Causal Temporal Relationship Learning With History-dependent
Noise [13.709618907099783]
We propose a novel causal relationship learning framework for time-series data, called Rhino.
Rhino combines vector auto-regression, deep learning and variational inference to model non-linear relationships with instantaneous effects.
Theoretically, we prove the structural identifiability of Rhino.
arXiv Detail & Related papers (2022-10-26T13:33:58Z) - Causal discovery under a confounder blanket [9.196779204457059]
Inferring causal relationships from observational data is rarely straightforward, but the problem is especially difficult in high dimensions.
We relax these assumptions and focus on an important but more specialized problem, namely recovering a directed acyclic subgraph.
We derive a complete algorithm for identifying causal relationships under these conditions and implement testing procedures.
arXiv Detail & Related papers (2022-05-11T18:10:45Z) - Causal Discovery from Conditionally Stationary Time Series [18.645887749731923]
State-Dependent Causal Inference (SDCI) is able to recover the underlying causal dependencies, provably with fully-observed states and empirically with hidden states.
Improved results over non-causal RNNs on modeling NBA player movements demonstrate the potential of our method.
arXiv Detail & Related papers (2021-10-12T18:12:57Z) - Discovering contemporaneous and lagged causal relations in
autocorrelated nonlinear time series datasets [9.949781365631557]
The paper introduces a novel conditional independence (CI) based method for linear and nonlinear, lagged and contemporaneous causal discovery.
Existing CI-based methods suffer from low recall and partially inflated false positives for strong autocorrelation.
The novel method, PCMCI+, extends PCMCI to include discovery of contemporaneous links.
arXiv Detail & Related papers (2020-03-07T23:33:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.