DecoR: Deconfounding Time Series with Robust Regression
- URL: http://arxiv.org/abs/2406.07005v1
- Date: Tue, 11 Jun 2024 06:59:17 GMT
- Title: DecoR: Deconfounding Time Series with Robust Regression
- Authors: Felix Schur, Jonas Peters
- Abstract summary: This work focuses on estimating the causal effect between two time series, which are confounded by a third, unobserved time series.
We introduce Deconfounding by Robust regression (DecoR), a novel approach that estimates the causal effect using robust linear regression in the frequency domain.
- Score: 8.02119748947076
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Causal inference on time series data is a challenging problem, especially in the presence of unobserved confounders. This work focuses on estimating the causal effect between two time series, which are confounded by a third, unobserved time series. Assuming spectral sparsity of the confounder, we show how in the frequency domain this problem can be framed as an adversarial outlier problem. We introduce Deconfounding by Robust regression (DecoR), a novel approach that estimates the causal effect using robust linear regression in the frequency domain. Considering two different robust regression techniques, we first improve existing bounds on the estimation error for such techniques. Crucially, our results do not require distributional assumptions on the covariates. We can therefore use them in time series settings. Applying these results to DecoR, we prove, under suitable assumptions, upper bounds for the estimation error of DecoR that imply consistency. We show DecoR's effectiveness through experiments on synthetic data. Our experiments furthermore suggest that our method is robust with respect to model misspecification.
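The pipeline the abstract describes (transform both series to the frequency domain, then treat the few confounded frequencies as adversarial outliers for a robust linear regression) can be sketched as follows. This is an illustrative toy, not the authors' implementation: the synthetic data, the simple trimmed-regression routine, and all constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta = 512, 2.0

# Synthetic data: Y = beta * X + confounder + noise, where the
# confounder is sparse in the frequency domain (two sinusoids).
t = np.arange(n)
x = rng.standard_normal(n)
confounder = 3.0 * np.sin(2 * np.pi * 5 * t / n) + 3.0 * np.cos(2 * np.pi * 17 * t / n)
y = beta * x + confounder + 0.1 * rng.standard_normal(n)

# In the frequency domain, the sparse confounder corrupts only a few
# frequencies, which now look like adversarial outliers.
X_f, Y_f = np.fft.rfft(x), np.fft.rfft(y)

def trimmed_regression(Xf, Yf, keep=0.8, iters=10):
    """Trimmed least squares: iteratively fit on the frequencies with
    the smallest residuals, discarding a fixed fraction as outliers."""
    idx = np.arange(len(Xf))
    b = 0.0
    for _ in range(iters):
        # least-squares slope on the currently kept frequencies
        b = np.real(np.vdot(Xf[idx], Yf[idx]) / np.vdot(Xf[idx], Xf[idx]))
        resid = np.abs(Yf - b * Xf)
        idx = np.argsort(resid)[: int(keep * len(Xf))]
    return b

beta_hat = trimmed_regression(X_f, Y_f)
print(f"true beta = {beta}, estimate = {beta_hat:.3f}")
```

The confounded frequency bins carry residuals orders of magnitude larger than the clean bins, so the trimming step removes them and the slope estimate recovers the causal coefficient.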
Related papers
- Risk and cross validation in ridge regression with correlated samples [72.59731158970894]
We analyze the in- and out-of-sample risks of ridge regression when the data points have arbitrary correlations.
We further extend our analysis to the case where the test point has non-trivial correlations with the training set, a setting often encountered in time series forecasting.
We validate our theory across a variety of high-dimensional datasets.
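As a minimal reminder of the estimator that entry studies, here is the closed-form ridge fit on samples with row-wise correlation. The AR(1)-style correlation structure, the dimensions, and all numbers below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, lam, rho = 200, 10, 1.0, 0.5

# Correlated samples: rows mixed by the Cholesky factor of an
# AR(1)-style correlation matrix C[i, j] = rho**|i - j|.
C = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
L = np.linalg.cholesky(C)
X = L @ rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * (L @ rng.standard_normal(n))  # correlated noise too

# Closed-form ridge estimator: w = (X^T X + lam I)^{-1} X^T y
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
print(f"recovery error: {np.linalg.norm(w_hat - w_true):.3f}")
```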
arXiv Detail & Related papers (2024-08-08T17:27:29Z) - A Study of Posterior Stability for Time-Series Latent Diffusion [59.41969496514184]
We first show that posterior collapse will reduce latent diffusion to a variational autoencoder (VAE), making it less expressive.
We then introduce a principled method: dependency measure, that quantifies the sensitivity of a recurrent decoder to input variables.
Building on our theoretical and empirical studies, we introduce a new framework that extends latent diffusion and has a stable posterior.
arXiv Detail & Related papers (2024-05-22T21:54:12Z) - Doubly Robust Proximal Causal Learning for Continuous Treatments [56.05592840537398]
We propose a kernel-based doubly robust causal learning estimator for continuous treatments.
We show that its oracle form is a consistent approximation of the influence function.
We then provide a comprehensive convergence analysis in terms of the mean square error.
arXiv Detail & Related papers (2023-09-22T12:18:53Z) - Explainable Parallel RCNN with Novel Feature Representation for Time Series Forecasting [0.0]
Time series forecasting is a fundamental challenge in data science.
We develop a parallel deep learning framework composed of RNN and CNN.
Extensive experiments on three datasets reveal the effectiveness of our method.
arXiv Detail & Related papers (2023-05-08T17:20:13Z) - Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series are recorded over a short time period, leaving a large gap between what deep models require and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - AER: Auto-Encoder with Regression for Time Series Anomaly Detection [12.418290128163882]
Anomaly detection on time series data is increasingly common across various industrial domains.
Recent unsupervised machine learning methods have made remarkable progress in tackling this problem.
We propose AER (Auto-encoder with Regression), a joint model that combines a vanilla auto-encoder and an LSTM regressor.
arXiv Detail & Related papers (2022-12-27T17:22:21Z) - Sequential Causal Effect Variational Autoencoder: Time Series Causal Link Estimation under Hidden Confounding [8.330791157878137]
Estimating causal effects from observational data sometimes leads to spurious relationships that can be mistaken for causal ones.
We propose Sequential Causal Effect Variational Autoencoder (SCEVAE), a novel method for time series causality analysis under hidden confounding.
arXiv Detail & Related papers (2022-09-23T09:43:58Z) - Identifying Causal Effects using Instrumental Time Series: Nuisance IV and Correcting for the Past [12.49477539101379]
We consider IV regression in time series models, such as vector auto-regressive (VAR) processes.
Direct applications of i.i.d. techniques are generally inconsistent as they do not correctly adjust for dependencies in the past.
We provide methods, prove their consistency, and show how the inferred causal effect can be used for distribution generalization.
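For intuition, the i.i.d. baseline that this entry's time series methods generalize is two-stage least squares (2SLS). The data-generating process below is a made-up static example with a hidden confounder, not the paper's time series setting.

```python
import numpy as np

rng = np.random.default_rng(3)
n, beta = 1000, 2.0

z = rng.standard_normal(n)                     # instrument
h = rng.standard_normal(n)                     # hidden confounder
x = 0.8 * z + h + 0.2 * rng.standard_normal(n)
y = beta * x + 2.0 * h + 0.2 * rng.standard_normal(n)

# OLS is biased by the confounder; 2SLS replaces x with its
# projection onto the instrument, which is independent of h.
ols = (x @ y) / (x @ x)
x_hat = z * ((z @ x) / (z @ z))                # first stage: regress x on z
tsls = (x_hat @ y) / (x_hat @ x_hat)           # second stage
print(f"OLS = {ols:.2f} (biased), 2SLS = {tsls:.2f}")
```

In time series, as the entry notes, applying this recipe naively is inconsistent because past dependencies are not adjusted for; that is the gap the paper addresses.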
arXiv Detail & Related papers (2022-03-11T16:29:48Z) - Adjusting for Autocorrelated Errors in Neural Networks for Time Series Regression and Forecasting [10.659189276058948]
We learn the autocorrelation coefficient jointly with the model parameters in order to adjust for autocorrelated errors.
For time series regression, large-scale experiments indicate that our method outperforms the Prais-Winsten method.
Results across a wide range of real-world datasets show that our method enhances performance in almost all cases.
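The classical baseline that entry compares against can be sketched with a Cochrane-Orcutt-style alternation, a close relative of Prais-Winsten: estimate the regression on rho-differenced data, re-estimate the AR(1) coefficient rho from the residuals, and repeat. The linear model and all numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, w_true, rho_true = 500, 1.5, 0.7

# y_t = w * x_t + u_t with AR(1) errors u_t = rho * u_{t-1} + eps_t
x = rng.standard_normal(n)
u = np.zeros(n)
eps = 0.3 * rng.standard_normal(n)
for t in range(1, n):
    u[t] = rho_true * u[t - 1] + eps[t]
y = w_true * x + u

w, rho = 0.0, 0.0
for _ in range(20):
    # fit w on rho-differenced data, where the errors are white
    xs, ys = x[1:] - rho * x[:-1], y[1:] - rho * y[:-1]
    w = (xs @ ys) / (xs @ xs)
    # re-estimate rho from the lag-1 autocorrelation of the residuals
    r = y - w * x
    rho = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])
print(f"w = {w:.3f}, rho = {rho:.3f}")
```

The paper's contribution is to learn the analogue of rho jointly with the model parameters by gradient descent, which extends this idea to nonlinear neural network regressors.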
arXiv Detail & Related papers (2021-01-28T04:25:51Z) - Gaussian MRF Covariance Modeling for Efficient Black-Box Adversarial Attacks [86.88061841975482]
We study the problem of generating adversarial examples in a black-box setting, where we only have access to a zeroth order oracle.
We use this setting to find fast one-step adversarial attacks, akin to a black-box version of the Fast Gradient Sign Method (FGSM).
We show that the method uses fewer queries and achieves higher attack success rates than the current state of the art.
arXiv Detail & Related papers (2020-10-08T18:36:51Z) - TadGAN: Time Series Anomaly Detection Using Generative Adversarial
Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
arXiv Detail & Related papers (2020-09-16T15:52:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.