CDANs: Temporal Causal Discovery from Autocorrelated and Non-Stationary
Time Series Data
- URL: http://arxiv.org/abs/2302.03246v2
- Date: Mon, 16 Oct 2023 22:27:47 GMT
- Title: CDANs: Temporal Causal Discovery from Autocorrelated and Non-Stationary
Time Series Data
- Authors: Muhammad Hasan Ferdous, Uzma Hasan, Md Osman Gani
- Abstract summary: Causal discovery holds the potential to play a significant role in extracting actionable insights about human health.
We present a novel constraint-based causal discovery approach for autocorrelated and non-stationary time series data.
Our approach identifies lagged and instantaneous/contemporaneous causal relationships along with changing modules that vary over time.
- Score: 5.130175508025212
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series data are found in many areas of healthcare such as medical time
series, electronic health records (EHR), measurements of vitals, and wearable
devices. Causal discovery, which involves estimating causal relationships from
observational data, holds the potential to play a significant role in
extracting actionable insights about human health. In this study, we present a
novel constraint-based causal discovery approach for autocorrelated and
non-stationary time series data (CDANs). Our proposed method addresses several
limitations of existing causal discovery methods for autocorrelated and
non-stationary time series data, such as high dimensionality, the inability to
identify lagged causal relationships, and overlooking changing modules. Our
approach identifies lagged and instantaneous/contemporaneous causal
relationships along with changing modules that vary over time. The method
optimizes the conditioning sets in a constraint-based search by considering
lagged parents instead of conditioning on the entire past, which addresses the
high dimensionality. The changing modules are detected by considering both
contemporaneous and lagged parents. The approach first detects the lagged
adjacencies, then identifies the changing modules and contemporaneous
adjacencies, and finally determines the causal direction. We extensively
evaluated our proposed method on synthetic and real-world clinical datasets,
and compared its performance with several baseline approaches. The experimental
results demonstrate the effectiveness of the proposed method in detecting
causal relationships and changing modules for autocorrelated and non-stationary
time series data.
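As a concrete illustration of the three-phase procedure described in the abstract, below is a minimal Python sketch of a CDANs-style constraint-based search. It is not the authors' implementation: it assumes linear-Gaussian data, a maximum lag of one, partial-correlation conditional-independence tests, and a surrogate time-index variable for flagging changing modules, and it omits the final orientation phase. The function and variable names (partial_corr_test, cdans_sketch) are illustrative only.

import numpy as np
from scipy import stats

def partial_corr_test(data, x, y, cond, alpha=0.05):
    # Conditional-independence test via partial correlation and Fisher's z.
    # Returns True when columns x and y look independent given the columns in cond.
    cols = [x, y] + list(cond)
    corr = np.corrcoef(data[:, cols], rowvar=False)
    prec = np.linalg.pinv(corr)
    r = np.clip(-prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1]), -0.999999, 0.999999)
    z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(data.shape[0] - len(cond) - 3)
    return 2 * (1 - stats.norm.cdf(abs(z))) > alpha

def cdans_sketch(series, alpha=0.05):
    # series: (T, d) array of observations X_t.
    T, d = series.shape
    # Columns 0..d-1 hold X_{t-1}, columns d..2d-1 hold X_t, and the last
    # column is a surrogate time index C used to detect non-stationarity.
    lagged = np.hstack([series[:-1], series[1:],
                        np.arange(T - 1, dtype=float).reshape(-1, 1)])
    past, present, time_idx = range(d), range(d, 2 * d), 2 * d

    # Phase 1: lagged adjacencies, conditioning only on the other lagged
    # candidates rather than on the entire past.
    lagged_parents = {j: set() for j in present}
    for j in present:
        for i in past:
            cond = [p for p in past if p != i]
            if not partial_corr_test(lagged, i, j, cond, alpha):
                lagged_parents[j].add(i)

    # Phase 2a: changing modules, i.e., variables that remain dependent on
    # the time index after conditioning on their lagged parents.
    changing = {j for j in present
                if not partial_corr_test(lagged, j, time_idx,
                                         sorted(lagged_parents[j]), alpha)}

    # Phase 2b: contemporaneous adjacencies, conditioning on lagged parents.
    contemp = set()
    for a in present:
        for b in present:
            if a < b and not partial_corr_test(
                    lagged, a, b,
                    sorted(lagged_parents[a] | lagged_parents[b]), alpha):
                contemp.add((a, b))

    # Phase 3 (orientation of contemporaneous edges) is omitted in this sketch.
    return lagged_parents, changing, contemp

# Toy usage: a stationary VAR(1) process with three variables.
rng = np.random.default_rng(0)
X = np.zeros((500, 3))
for t in range(1, 500):
    X[t] = 0.6 * X[t - 1] + rng.standard_normal(3)
print(cdans_sketch(X))

In a full implementation, the third phase would orient the remaining contemporaneous edges (for example, via collider detection and orientation rules), while the direction of lagged edges is already fixed by time order.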
Related papers
- On the Identification of Temporally Causal Representation with Instantaneous Dependence [50.14432597910128]
Temporally causal representation learning aims to identify the latent causal process from time series observations.
Most methods require the assumption that the latent causal processes do not have instantaneous relations.
We propose an IDentification framework for instantaneOus Latent dynamics.
arXiv Detail & Related papers (2024-05-24T08:08:05Z) - Causal discovery for time series with constraint-based model and PMIME
measure [0.0]
We present a novel approach for discovering causality in time series data that combines a causal discovery algorithm with an information-theoretic measure.
We evaluate the performance of our approach on several simulated data sets, showing promising results.
arXiv Detail & Related papers (2023-05-31T09:38:50Z) - eCDANs: Efficient Temporal Causal Discovery from Autocorrelated and
Non-stationary Data (Student Abstract) [0.3314882635954752]
We present a constraint-based CD approach for autocorrelated and non-stationary time series data (eCDANs).
eCDANs can detect lagged and contemporaneous causal relationships along with temporal changes.
Experiments on synthetic and real-world data show that eCDANs can identify time influence and outperform the baselines.
arXiv Detail & Related papers (2023-03-06T01:59:45Z) - DynImp: Dynamic Imputation for Wearable Sensing Data Through Sensory and
Temporal Relatedness [78.98998551326812]
We argue that traditional methods have rarely made use of both the time-series dynamics of the data and the relatedness of features from different sensors.
We propose a model, termed DynImp, to handle missingness at different time points using nearest neighbors along the feature axis.
We show that the method can exploit the multi-modality features from related sensors and also learn from history time-series dynamics to reconstruct the data under extreme missingness.
arXiv Detail & Related papers (2022-09-26T21:59:14Z) - Causality-Based Multivariate Time Series Anomaly Detection [63.799474860969156]
We formulate the anomaly detection problem from a causal perspective and view anomalies as instances that do not follow the regular causal mechanism that generates the multivariate data.
We then propose a causality-based anomaly detection approach, which first learns the causal structure from data and then infers whether an instance is an anomaly relative to the local causal mechanism.
We evaluate our approach with both simulated and public datasets as well as a case study on real-world AIOps applications.
arXiv Detail & Related papers (2022-06-30T06:00:13Z) - Path Signature Area-Based Causal Discovery in Coupled Time Series [0.0]
We propose the application of confidence sequences to analyze the significance of the magnitude of the signed area between two variables.
This approach provides a new way to define the confidence of a causal link existing between two time series.
arXiv Detail & Related papers (2021-10-23T19:57:22Z) - Causal Discovery from Conditionally Stationary Time Series [18.645887749731923]
State-Dependent Causal Inference (SDCI) is able to recover the underlying causal dependencies, provably with fully-observed states and empirically with hidden states.
Improved results over non-causal RNNs on modeling NBA player movements demonstrate the potential of our method.
arXiv Detail & Related papers (2021-10-12T18:12:57Z) - Learning Neural Causal Models with Active Interventions [83.44636110899742]
We introduce an active intervention-targeting mechanism which enables a quick identification of the underlying causal structure of the data-generating process.
Our method significantly reduces the required number of interactions compared with random intervention targeting.
We demonstrate superior performance on multiple benchmarks from simulated to real-world data.
arXiv Detail & Related papers (2021-09-06T13:10:37Z) - TadGAN: Time Series Anomaly Detection Using Generative Adversarial
Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
arXiv Detail & Related papers (2020-09-16T15:52:04Z) - High-recall causal discovery for autocorrelated time series with latent
confounders [12.995632804090198]
We show that existing causal discovery methods such as FCI and its variants suffer from low recall in the autocorrelated time series case.
We provide Python code for all methods involved in the simulation studies.
arXiv Detail & Related papers (2020-07-03T18:01:04Z) - Amortized Causal Discovery: Learning to Infer Causal Graphs from
Time-Series Data [63.15776078733762]
We propose Amortized Causal Discovery, a novel framework to learn to infer causal relations from time-series data.
We demonstrate experimentally that this approach, implemented as a variational model, leads to significant improvements in causal discovery performance.
arXiv Detail & Related papers (2020-06-18T19:59:12Z)