Deep Learning-based Group Causal Inference in Multivariate Time-series
- URL: http://arxiv.org/abs/2401.08386v1
- Date: Tue, 16 Jan 2024 14:19:28 GMT
- Title: Deep Learning-based Group Causal Inference in Multivariate Time-series
- Authors: Wasim Ahmad, Maha Shadaydeh, Joachim Denzler
- Abstract summary: Causal inference in a nonlinear system of multivariate time series is instrumental in disentangling the intricate web of relationships among variables.
In this work, we test model invariance by group-level interventions on the trained deep networks to infer causal direction in groups of variables.
- Score: 8.073449277052495
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Causal inference in a nonlinear system of multivariate time series is
instrumental in disentangling the intricate web of relationships among
variables, enabling us to make more accurate predictions and gain deeper
insights into real-world complex systems. Causality methods typically identify
the causal structure of a multivariate system by considering the cause-effect
relationship of each pair of variables while ignoring the collective effect of
a group of variables or interactions involving more than two time-series
variables. In this work, we test model invariance by group-level interventions
on the trained deep networks to infer causal direction in groups of variables,
such as climate and ecosystem, brain networks, etc. Extensive testing with
synthetic and real-world time series data shows a significant improvement of
our method over other applied group causality methods and provides insights
into real-world time series. The code for our method can be found at:
https://github.com/wasimahmadpk/gCause.
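To make the group-intervention idea concrete, here is a minimal illustrative sketch, not the authors' gCause implementation: a small scikit-learn MLP stands in for the trained deep network, and permuting the intervened group stands in for the intervention mechanism. If prediction error for one group stays invariant under interventions on the other group, that is read as evidence against a causal link in that direction.

    # Toy group-level intervention test: does intervening on group A change the
    # model's ability to predict group B (and vice versa)?
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Synthetic data: group A (vars 0-1) drives group B (vars 2-3) with lag 1.
    T = 1200
    A = rng.normal(size=(T, 2))
    W = np.array([[0.8, -0.4], [0.3, 0.9]])
    B = np.roll(np.tanh(A @ W), 1, axis=0) + 0.1 * rng.normal(size=(T, 2))
    X = np.hstack([A, B])
    X_past, X_next = X[:-1], X[1:]           # one-step-ahead setup

    def group_effect(past, target, group_cols, n_perm=30):
        """Fit target from all past variables, then permute the columns of the
        intervened group and report the resulting increase in prediction error."""
        model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=800, random_state=0)
        model.fit(past, target)
        base = np.mean((model.predict(past) - target) ** 2)
        errs = []
        for _ in range(n_perm):
            p = past.copy()
            p[:, group_cols] = past[rng.permutation(len(past))][:, group_cols]
            errs.append(np.mean((model.predict(p) - target) ** 2))
        return np.mean(errs) - base           # large increase -> group looks causal

    print("A -> B effect:", group_effect(X_past, X_next[:, 2:4], [0, 1]))
    print("B -> A effect:", group_effect(X_past, X_next[:, 0:2], [2, 3]))

In this toy setup the A -> B score should come out clearly larger than the B -> A score; the paper applies this kind of invariance test to trained deep networks on grouped variables such as climate, ecosystem, and brain networks, with the decision criteria described there.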
Related papers
- Targeted Cause Discovery with Data-Driven Learning [66.86881771339145]
We propose a novel machine learning approach for inferring causal variables of a target variable from observations.
We employ a neural network trained to identify causality through supervised learning on simulated data.
Empirical results demonstrate the effectiveness of our method in identifying causal relationships within large-scale gene regulatory networks.
arXiv Detail & Related papers (2024-08-29T02:21:11Z)
- Learning Divergence Fields for Shift-Robust Graph Representations [73.11818515795761]
In this work, we propose a geometric diffusion model with learnable divergence fields for the challenging problem with interdependent data.
We derive a new learning objective through causal inference, which can guide the model to learn generalizable patterns of interdependence that are insensitive across domains.
arXiv Detail & Related papers (2024-06-07T14:29:21Z)
- Kernel-based Joint Independence Tests for Multivariate Stationary and Non-stationary Time Series [0.6749750044497732]
We introduce kernel-based statistical tests of joint independence in multivariate time series.
We show how the method robustly uncovers significant higher-order dependencies in synthetic examples.
Our method can aid in uncovering high-order interactions in data.
arXiv Detail & Related papers (2023-05-15T10:38:24Z)
- Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
arXiv Detail & Related papers (2023-05-11T10:30:35Z)
- Causality-Based Multivariate Time Series Anomaly Detection [63.799474860969156]
We formulate the anomaly detection problem from a causal perspective and view anomalies as instances that do not follow the regular causal mechanism to generate the multivariate data.
We then propose a causality-based anomaly detection approach, which first learns the causal structure from data and then infers whether an instance is an anomaly relative to the local causal mechanism.
We evaluate our approach with both simulated and public datasets as well as a case study on real-world AIOps applications.
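As a toy illustration of this two-step scheme (a simplified sketch, not the paper's implementation: the parent sets are assumed given rather than discovered, and linear regressions stand in for the learned causal mechanisms), an instance can be scored by how strongly it violates each variable's local mechanism:

    # Score instances by their disagreement with the local causal mechanisms.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)

    # Toy chain x0 -> x1 -> x2; the parent sets below are assumed to come from
    # a preceding causal-discovery step, which this sketch does not implement.
    parents = {0: [], 1: [0], 2: [1]}
    n = 2000
    x0 = rng.normal(size=n)
    x1 = 0.9 * x0 + 0.1 * rng.normal(size=n)
    x2 = -0.7 * x1 + 0.1 * rng.normal(size=n)
    X = np.column_stack([x0, x1, x2])
    X[-1, 2] += 3.0                                 # inject a single anomaly

    scores = np.zeros(n)
    for j, pa in parents.items():
        if pa:
            m = LinearRegression().fit(X[:, pa], X[:, j])
            resid = X[:, j] - m.predict(X[:, pa])
        else:
            resid = X[:, j] - X[:, j].mean()
        scores += (resid / resid.std()) ** 2        # normalized mechanism violations

    print("most anomalous instance:", int(np.argmax(scores)))   # the injected one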
arXiv Detail & Related papers (2022-06-30T06:00:13Z)
- Equivariance Allows Handling Multiple Nuisance Variables When Analyzing Pooled Neuroimaging Datasets [53.34152466646884]
In this paper, we show how bringing recent results on equivariant representation learning instantiated on structured spaces together with simple use of classical results on causal inference provides an effective practical solution.
We demonstrate how our model allows dealing with more than one nuisance variable under some assumptions and can enable analysis of pooled scientific datasets in scenarios that would otherwise entail removing a large portion of the samples.
arXiv Detail & Related papers (2022-03-29T04:54:06Z)
- Causal Inference in Non-linear Time-series using Deep Networks and Knockoff Counterfactuals [8.56007054019834]
Non-linear coupling of variables is one of the major challenges in the accurate estimation of cause-effect relations.
We propose to use deep autoregressive networks (DeepAR) in tandem with counterfactual analysis to infer nonlinear causal relations.
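The following rough sketch illustrates the counterfactual idea under stand-in assumptions (a gradient-boosting regressor instead of DeepAR, and an independently generated AR(1) surrogate instead of proper model-X knockoffs):

    # Train on the real candidate cause, then substitute a knockoff-like
    # counterfactual at prediction time and measure the loss in accuracy.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(2)

    # Toy pair: x causes y non-linearly with a one-step lag.
    T = 1500
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = 0.6 * x[t - 1] + rng.normal(scale=0.5)
    y = np.sin(np.roll(x, 1)) + 0.1 * rng.normal(size=T)

    def ar1_surrogate(series):
        """Crude knockoff stand-in: same lag-1 structure, generated independently."""
        phi = np.corrcoef(series[:-1], series[1:])[0, 1]
        sigma = np.std(series) * np.sqrt(1 - phi ** 2)
        out = np.zeros_like(series)
        for t in range(1, len(series)):
            out[t] = phi * out[t - 1] + rng.normal(scale=sigma)
        return out

    model = GradientBoostingRegressor(random_state=0)
    model.fit(x[:-1, None], y[1:])
    err_real = np.mean((model.predict(x[:-1, None]) - y[1:]) ** 2)
    err_knock = np.mean((model.predict(ar1_surrogate(x)[:-1, None]) - y[1:]) ** 2)
    print("error with real x:", err_real, "with counterfactual x:", err_knock)
    # A large error increase under the counterfactual suggests x is a cause of y.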
arXiv Detail & Related papers (2021-09-22T16:07:27Z)
- Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z)
- Explainable Multivariate Time Series Classification: A Deep Neural Network Which Learns To Attend To Important Variables As Well As Informative Time Intervals [32.30627405832656]
Time series data is prevalent in a wide variety of real-world applications.
A key criterion for understanding such predictive models involves elucidating and quantifying the contribution of time-varying input variables to the classification.
We introduce a novel, modular, convolution-based feature extraction and attention mechanism that simultaneously identifies the variables as well as time intervals which determine the classification output.
arXiv Detail & Related papers (2020-11-23T19:16:46Z)
- LAVARNET: Neural Network Modeling of Causal Variable Relationships for Multivariate Time Series Forecasting [18.89688469820947]
A novel neural network-based architecture is proposed, termed LAgged VAriable NETwork.
It intrinsically estimates the importance of latent lagged variables and combines high-dimensional representations of them to predict future values of time series.
Our model is compared with baseline and state-of-the-art neural network architectures on one simulated data set and four real data sets from the areas of meteorology, music, solar activity, and finance.
arXiv Detail & Related papers (2020-09-02T10:57:28Z)
- Pay Attention to Evolution: Time Series Forecasting with Deep Graph-Evolution Learning [33.79957892029931]
This work presents a novel neural network architecture for time-series forecasting.
We named our method the Recurrent Graph Evolution Neural Network (ReGENN).
An extensive set of experiments was conducted comparing ReGENN with dozens of ensemble methods and classical statistical ones.
arXiv Detail & Related papers (2020-08-28T20:10:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.