Causal Inference in Non-linear Time-series using Deep Networks and Knockoff Counterfactuals
- URL: http://arxiv.org/abs/2109.10817v2
- Date: Thu, 23 Sep 2021 14:19:40 GMT
- Title: Causal Inference in Non-linear Time-series using Deep Networks and Knockoff Counterfactuals
- Authors: Wasim Ahmad, Maha Shadaydeh, Joachim Denzler
- Abstract summary: Non-linear coupling of variables is one of the major challenges in the accurate estimation of cause-effect relations.
We propose to use deep autoregressive networks (DeepAR) in tandem with counterfactual analysis to infer nonlinear causal relations.
- Score: 8.56007054019834
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Estimating causal relations is vital in understanding the complex
interactions in multivariate time series. Non-linear coupling of variables is
one of the major challenges in the accurate estimation of cause-effect relations. In
this paper, we propose to use deep autoregressive networks (DeepAR) in tandem
with counterfactual analysis to infer nonlinear causal relations in
multivariate time series. We extend the concept of Granger causality using
probabilistic forecasting with DeepAR. Since deep networks can neither handle
missing input nor out-of-distribution intervention, we propose to use the
Knockoffs framework (Barber and Candès, 2015) for generating intervention
variables and consequently counterfactual probabilistic forecasting. Knockoff
samples are independent of their output given the observed variables and
exchangeable with their counterpart variables without changing the underlying
distribution of the data. We test our method on synthetic as well as real-world
time series datasets. Overall our method outperforms the widely used vector
autoregressive Granger causality and PCMCI in detecting nonlinear causal
dependency in multivariate time series.
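To make the described pipeline concrete, below is a minimal, self-contained sketch of the knockoff-substitution idea. It is not the authors' implementation: it assumes second-order Gaussian knockoffs and a gradient-boosted regressor as a stand-in for DeepAR's probabilistic forecaster, and scores a candidate cause by how much the target's forecast error grows when the candidate's lagged values are swapped for their knockoff copies.

```python
# Minimal sketch of knockoff-based counterfactual Granger testing.
# Assumptions (not the paper's code): second-order Gaussian knockoffs and a
# gradient-boosted regressor standing in for DeepAR's probabilistic forecaster.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def gaussian_knockoffs(X, rng):
    """Equicorrelated second-order Gaussian knockoffs for standardized X."""
    n, p = X.shape
    Sigma = np.cov(X, rowvar=False)
    s = min(1.0, 2.0 * np.linalg.eigvalsh(Sigma).min()) * 0.999 * np.ones(p)
    Sigma_inv = np.linalg.inv(Sigma)
    cond_mean = X - X @ Sigma_inv @ np.diag(s)
    cond_cov = 2.0 * np.diag(s) - np.diag(s) @ Sigma_inv @ np.diag(s)
    L = np.linalg.cholesky(cond_cov + 1e-8 * np.eye(p))
    return cond_mean + rng.standard_normal((n, p)) @ L.T

def lagged_design(X, target, lag):
    """Predict X[t, target] from the previous `lag` values of every series."""
    n, p = X.shape
    Z = np.asarray([X[t - lag:t].ravel() for t in range(lag, n)])
    return Z, X[lag:, target]

def knockoff_granger_scores(X, lag=2, seed=0):
    """scores[j, i] > 0 suggests series j helps forecast series i: swapping
    j's lagged values for their knockoffs increases i's forecast error."""
    rng = np.random.default_rng(seed)
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    Xk = gaussian_knockoffs(X, rng)
    n, p = X.shape
    scores = np.zeros((p, p))
    for i in range(p):
        Z, y = lagged_design(X, i, lag)
        Zk, _ = lagged_design(Xk, i, lag)
        model = GradientBoostingRegressor(random_state=seed).fit(Z, y)
        base_mse = np.mean((model.predict(Z) - y) ** 2)  # in-sample, for brevity
        for j in range(p):
            Zcf = Z.copy()
            cols = [k * p + j for k in range(lag)]  # lagged copies of series j
            Zcf[:, cols] = Zk[:, cols]
            scores[j, i] = np.mean((model.predict(Zcf) - y) ** 2) - base_mse
    return scores
```

On a system where, say, the first series drives the second with a short lag, the corresponding entry of the score matrix should stand out; a residual- or permutation-based significance test would then turn these raw scores into causal decisions.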
Related papers
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining prediction intervals via the empirical estimation of quantiles in the distribution of outputs, but it requires the specific quantile levels to be fixed a priori.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile-regression-based interval construction that removes this constraint.
We demonstrate that this added flexibility results in intervals with an improvement in desirable qualities.
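For context, the quantile-regression baseline that RQR relaxes can be reproduced in a few lines. The sketch below uses scikit-learn's quantile loss with illustrative 0.05/0.95 levels and simulated input-dependent noise; none of it is taken from the paper.

```python
# Interval construction via plain quantile regression (the baseline RQR relaxes):
# fit one model per quantile level and pair the predictions into an interval.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
noise_scale = 0.3 + 0.4 * (X[:, 0] > 0)          # noise level depends on the input
y = np.sin(X[:, 0]) + rng.normal(scale=noise_scale)

lower = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, y)
upper = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)

X_test = np.linspace(-3, 3, 7).reshape(-1, 1)
for x, lo, hi in zip(X_test[:, 0], lower.predict(X_test), upper.predict(X_test)):
    print(f"x = {x:+.1f}   90% interval: [{lo:+.2f}, {hi:+.2f}]")
```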
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- Multivariate Probabilistic Time Series Forecasting with Correlated Errors [17.212396544233307]
We present a plug-and-play method that learns the covariance structure of errors over multiple steps for autoregressive models with Gaussian-distributed errors.
The learned covariance matrix can be used to calibrate predictions based on observed residuals.
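The general recipe this summary describes can be illustrated independently of the paper's model: collect multi-step forecast residuals on held-out windows, estimate their covariance across horizons, and draw correlated error paths around a point forecast. Everything below, including the 12-step horizon and the fabricated residuals, is an illustrative assumption rather than the paper's plug-and-play method.

```python
# Illustrative calibration with an estimated multi-step error covariance.
import numpy as np

def multistep_error_cov(residuals):
    """residuals: (n_windows, horizon) forecast errors from rolled-out forecasts."""
    return np.cov(residuals, rowvar=False)

def sample_forecast_paths(point_forecast, error_cov, n_samples=1000, seed=0):
    """Draw sample paths whose errors are correlated across forecast steps."""
    rng = np.random.default_rng(seed)
    noise = rng.multivariate_normal(np.zeros(len(point_forecast)), error_cov, size=n_samples)
    return point_forecast + noise

# Usage with fabricated residuals from 200 rolling 12-step-ahead forecasts.
rng = np.random.default_rng(1)
residuals = rng.standard_normal((200, 12)).cumsum(axis=1) * 0.2  # growing, correlated errors
cov = multistep_error_cov(residuals)
paths = sample_forecast_paths(np.linspace(1.0, 2.0, 12), cov)
lower, upper = np.quantile(paths, [0.05, 0.95], axis=0)          # calibrated 90% band
```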
arXiv Detail & Related papers (2024-02-01T20:27:19Z)
- Deep Learning-based Group Causal Inference in Multivariate Time-series [8.073449277052495]
Causal inference in a nonlinear system of multivariate time series is instrumental in disentangling the intricate web of relationships among variables.
In this work, we test model invariance by group-level interventions on the trained deep networks to infer causal direction in groups of variables.
arXiv Detail & Related papers (2024-01-16T14:19:28Z)
- Discovering Predictable Latent Factors for Time Series Forecasting [39.08011991308137]
We develop a novel framework for inferring the intrinsic latent factors implied by the observable time series.
We introduce three characteristics, i.e., predictability, sufficiency, and identifiability, and model them with deep latent dynamics models.
Empirical results on multiple real datasets show the efficiency of our method for different kinds of time series forecasting.
arXiv Detail & Related papers (2023-03-18T14:37:37Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series data are often recorded over a short time period, which leaves a large gap between what deep models require and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Causal Discovery using Model Invariance through Knockoff Interventions [8.330791157878137]
We model nonlinear interactions in time series using DeepAR.
We expose the model to different environments using Knockoffs-based interventions.
We show that the distribution of the response residual does not change significantly upon interventions on non-causal predictors.
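The residual-invariance check described here is easy to illustrate on toy data: fit a model on all predictors, replace one predictor with an exchangeable copy, and compare residual distributions. The sketch below uses a random permutation as a crude stand-in for a knockoff copy and a two-sample KS test; it is not the paper's code.

```python
# Toy illustration of the invariance test: intervening on a non-causal predictor
# should leave the response residual distribution essentially unchanged.
import numpy as np
from scipy.stats import ks_2samp
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 2000
x_cause = rng.standard_normal(n)
x_noncause = rng.standard_normal(n)
y = 2.0 * x_cause + 0.5 * rng.standard_normal(n)     # only x_cause drives y

X = np.column_stack([x_cause, x_noncause])
model = LinearRegression().fit(X, y)
base_resid = y - model.predict(X)

for j, name in enumerate(["cause", "non-cause"]):
    X_int = X.copy()
    X_int[:, j] = rng.permutation(X[:, j])           # crude stand-in for a knockoff copy
    resid = y - model.predict(X_int)
    print(f"intervene on {name}: KS p-value = {ks_2samp(base_resid, resid).pvalue:.3g}")
```

Intervening on the causal predictor shifts the residual distribution sharply (tiny p-value), while the non-causal predictor leaves it unchanged, which is the signal used to infer causal direction.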
arXiv Detail & Related papers (2022-07-08T14:46:47Z)
- Deep Recurrent Modelling of Granger Causality with Latent Confounding [0.0]
We propose a deep learning-based approach to model non-linear Granger causality by directly accounting for latent confounders.
We demonstrate the model performance on non-linear time series for which the latent confounder influences the cause and effect with different time lags.
arXiv Detail & Related papers (2022-02-23T03:26:22Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
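As a quick reference for the underlying tool, here is a univariate sketch of the NGBoost API (assuming the `ngboost` package is installed); the multivariate setting in the paper additionally requires a multivariate predictive distribution, which this sketch leaves out.

```python
# Univariate NGBoost sketch: boosted estimation of a full predictive distribution.
import numpy as np
from ngboost import NGBRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(400, 3))
y = X[:, 0] ** 2 + rng.normal(scale=0.3, size=400)

ngb = NGBRegressor().fit(X, y)          # Normal predictive distribution by default
dist = ngb.pred_dist(X[:5])             # predictive distribution for the first 5 rows
print(dist.params["loc"], dist.params["scale"])  # per-row mean and standard deviation
```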
arXiv Detail & Related papers (2021-06-07T17:44:49Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of the future based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
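To ground the summary, here is a minimal PyTorch sketch of the general pattern (an assumed structure, not the paper's architecture): a latent state evolved by a learned transition network and decoded by an emission network into a predictive distribution.

```python
# Minimal deep state space step: network-parameterized transition and emission.
import torch
import torch.nn as nn

class DeepStateSpaceCell(nn.Module):
    def __init__(self, state_dim=8, obs_dim=1, hidden=32):
        super().__init__()
        self.transition = nn.Sequential(nn.Linear(state_dim, hidden), nn.Tanh(),
                                        nn.Linear(hidden, state_dim))
        self.emission = nn.Sequential(nn.Linear(state_dim, hidden), nn.Tanh(),
                                      nn.Linear(hidden, 2 * obs_dim))  # mean, log-scale

    def forward(self, state):
        next_state = self.transition(state)
        mean, log_scale = self.emission(next_state).chunk(2, dim=-1)
        return next_state, torch.distributions.Normal(mean, log_scale.exp())

# One forecast step: propagate the latent state and obtain a predictive distribution.
cell = DeepStateSpaceCell()
state, pred = cell(torch.zeros(1, 8))
print(pred.mean.shape, pred.stddev.shape)   # torch.Size([1, 1]) each
```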
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Latent Causal Invariant Model [128.7508609492542]
Current supervised learning can learn spurious correlation during the data-fitting process.
We propose a Latent Causal Invariance Model (LaCIM) which pursues causal prediction.
arXiv Detail & Related papers (2020-11-04T10:00:27Z)