Rhino: Deep Causal Temporal Relationship Learning With History-dependent Noise
- URL: http://arxiv.org/abs/2210.14706v1
- Date: Wed, 26 Oct 2022 13:33:58 GMT
- Title: Rhino: Deep Causal Temporal Relationship Learning With History-dependent Noise
- Authors: Wenbo Gong, Joel Jennings, Cheng Zhang and Nick Pawlowski
- Abstract summary: We propose a novel causal relationship learning framework for time-series data, called Rhino.
Rhino combines vector auto-regression, deep learning and variational inference to model non-linear relationships with instantaneous effects.
Theoretically, we prove the structural identifiability of Rhino.
- Score: 13.709618907099783
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Discovering causal relationships between different variables from time series
data has been a long-standing challenge for many domains such as climate
science, finance, and healthcare. Given the complexity of real-world
relationships and the nature of observations in discrete time, causal discovery
methods need to consider non-linear relations between variables, instantaneous
effects and history-dependent noise (the change of noise distribution due to
past actions). However, previous works do not offer a solution addressing all
these problems together. In this paper, we propose a novel causal relationship
learning framework for time-series data, called Rhino, which combines vector
auto-regression, deep learning and variational inference to model non-linear
relationships with instantaneous effects while allowing the noise distribution
to be modulated by historical observations. Theoretically, we prove the
structural identifiability of Rhino. Our empirical results from extensive
synthetic experiments and two real-world benchmarks demonstrate better
discovery performance compared to relevant baselines, with ablation studies
revealing its robustness under model misspecification.
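To make the modelling assumption concrete, below is a minimal NumPy sketch of a Rhino-style structural equation model: each variable is a non-linear function of its lagged and instantaneous parents plus a noise term whose distribution is modulated by history. The toy graph and the particular f and g are illustrative placeholders, not the paper's parameterization (Rhino learns f and g with neural networks and infers the graph with variational inference).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-node graph with lag 1 (illustrative assumption, not from the paper).
# adj_lag[i, j] = 1: x_j at time t-1 is a lagged parent of x_i at time t.
# adj_inst[i, j] = 1: x_j at time t is an instantaneous parent of x_i at
# time t; the instantaneous graph must be acyclic.
adj_lag = np.array([[1, 0],
                    [1, 1]])
adj_inst = np.array([[0, 0],
                     [1, 0]])  # x_0 -> x_1 within the same time step

def f(lagged, inst):
    # Hypothetical non-linear mean function of the parents.
    return np.tanh(lagged.sum() + inst.sum())

def g(lagged, eps):
    # Hypothetical history-dependent noise: the scale of eps is modulated
    # by the lagged parents, so past values change the noise distribution
    # rather than just the mean.
    scale = 0.1 + 0.5 * np.abs(np.tanh(lagged.sum()))
    return scale * eps

T, d = 200, 2
x = np.zeros((T, d))
for t in range(1, T):
    for i in range(d):  # node order 0, 1 is a topological order of adj_inst
        lagged = x[t - 1, adj_lag[i].astype(bool)]
        inst = x[t, adj_inst[i].astype(bool)]
        x[t, i] = f(lagged, inst) + g(lagged, rng.standard_normal())
```

The history-dependent g is what separates this from a standard non-linear SVAR: if g were pure additive noise, past values could shift only the mean of each variable, not the shape or scale of its noise.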
Related papers
- Causal Discovery from Time-Series Data with Short-Term Invariance-Based Convolutional Neural Networks [12.784885649573994]
Causal discovery from time-series data aims to capture both intra-slice (contemporaneous) and inter-slice (time-lagged) causality.
We propose STIC, a novel gradient-based causal discovery approach which focuses on Short-Term Invariance using Convolutional neural networks.
arXiv Detail & Related papers (2024-08-15T08:43:28Z) - On the Identification of Temporally Causal Representation with Instantaneous Dependence [50.14432597910128]
Temporally causal representation learning aims to identify the latent causal process from time series observations.
Most methods require the assumption that the latent causal processes do not have instantaneous relations.
We propose IDOL, an IDentification framework for instantaneOus Latent dynamics.
arXiv Detail & Related papers (2024-05-24T08:08:05Z) - Identifiable Latent Polynomial Causal Models Through the Lens of Change [82.14087963690561]
Causal representation learning aims to unveil latent high-level causal representations from observed low-level data.
One of its primary tasks is to provide reliable guarantees that these latent causal models can be identified, a property known as identifiability.
arXiv Detail & Related papers (2023-10-24T07:46:10Z) - Discovering Mixtures of Structural Causal Models from Time Series Data [23.18511951330646]
We propose a general variational inference-based framework called MCD to infer the underlying causal models.
Our approach employs an end-to-end training process that maximizes an evidence lower bound (ELBO) on the data likelihood; a toy sketch of this kind of objective appears after this list.
We demonstrate that our method outperforms state-of-the-art methods on causal discovery benchmarks.
arXiv Detail & Related papers (2023-10-10T05:13:10Z) - From Temporal to Contemporaneous Iterative Causal Discovery in the
Presence of Latent Confounders [6.365889364810238]
We present a constraint-based algorithm for learning causal structures from observational time-series data.
We assume a discrete-time, stationary structural vector autoregressive process, with both temporal and contemporaneous causal relations.
The presented algorithm gradually refines a causal graph by learning long-term temporal relations before short-term ones, where contemporaneous relations are learned last.
arXiv Detail & Related papers (2023-06-01T12:46:06Z) - Towards Causal Representation Learning and Deconfounding from Indefinite
Data [17.793702165499298]
Non-statistical data (e.g., images and text) conflicts with traditional causal data in both its properties and the methods it requires.
We redefine causal data from two novel perspectives and then propose three data paradigms.
We implement the above designs as a dynamic variational inference model, tailored to learn causal representation from indefinite data.
arXiv Detail & Related papers (2023-05-04T08:20:37Z) - DOMINO: Visual Causal Reasoning with Time-Dependent Phenomena [59.291745595756346]
We propose a set of visual analytics methods that allow humans to participate in the discovery of causal relations associated with windows of time delay.
Specifically, we leverage a well-established method, logic-based causality, to enable analysts to test the significance of potential causes.
Since an effect can be a cause of other effects, we allow users to aggregate different temporal cause-effect relations found with our method into a visual flow diagram.
arXiv Detail & Related papers (2023-03-12T03:40:21Z) - Learning Neural Causal Models with Active Interventions [83.44636110899742]
We introduce an active intervention-targeting mechanism which enables a quick identification of the underlying causal structure of the data-generating process.
Our method significantly reduces the required number of interactions compared with random intervention targeting.
We demonstrate superior performance on multiple benchmarks from simulated to real-world data.
arXiv Detail & Related papers (2021-09-06T13:10:37Z) - Disentangling Observed Causal Effects from Latent Confounders using
Method of Moments [67.27068846108047]
We provide guarantees on identifiability and learnability under mild assumptions.
We develop efficient algorithms based on coupled tensor decomposition with linear constraints to obtain scalable and guaranteed solutions.
arXiv Detail & Related papers (2021-01-17T07:48:45Z) - Amortized Causal Discovery: Learning to Infer Causal Graphs from
Time-Series Data [63.15776078733762]
We propose Amortized Causal Discovery, a novel framework to learn to infer causal relations from time-series data.
We demonstrate experimentally that this approach, implemented as a variational model, leads to significant improvements in causal discovery performance.
arXiv Detail & Related papers (2020-06-18T19:59:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.