On the Identification of Temporally Causal Representation with Instantaneous Dependence
- URL: http://arxiv.org/abs/2405.15325v2
- Date: Fri, 7 Jun 2024 11:02:49 GMT
- Title: On the Identification of Temporally Causal Representation with Instantaneous Dependence
- Authors: Zijian Li, Yifan Shen, Kaitao Zheng, Ruichu Cai, Xiangchen Song, Mingming Gong, Zhengmao Zhu, Guangyi Chen, Kun Zhang
- Abstract summary: Temporally causal representation learning aims to identify the latent causal process from time series observations.
Most methods require the assumption that the latent causal processes do not have instantaneous relations.
We propose IDOL, an IDentification framework for instantaneOus Latent dynamics.
- Score: 50.14432597910128
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporally causal representation learning aims to identify the latent causal process from time series observations, but most methods require the assumption that the latent causal processes do not have instantaneous relations. Although some recent methods achieve identifiability in the instantaneous causality case, they require either interventions on the latent variables or grouping of the observations, which are in general difficult to obtain in real-world scenarios. To fill this gap, we propose an \textbf{ID}entification framework for instantane\textbf{O}us \textbf{L}atent dynamics (\textbf{IDOL}) by imposing a sparse influence constraint that the latent causal processes have sparse time-delayed and instantaneous relations. Specifically, we establish identifiability results of the latent causal process based on sufficient variability and the sparse influence constraint by employing contextual information of time series data. Based on these theories, we incorporate a temporally variational inference architecture to estimate the latent variables and a gradient-based sparsity regularization to identify the latent causal process. Experimental results on simulation datasets illustrate that our method can identify the latent causal process. Furthermore, evaluations on multiple human motion forecasting benchmarks with instantaneous dependencies indicate the effectiveness of our method in real-world settings.
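The abstract mentions a gradient-based sparsity regularization used to encourage sparse time-delayed and instantaneous relations among latent variables. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea, penalizing the L1 norm of a learned latent transition's Jacobian with respect to both the previous and the current latent state; the network architecture, function names, and loss composition are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of gradient-based sparsity regularization:
# penalize the Jacobian of a learned latent transition so that each latent variable
# depends on only a few time-delayed and instantaneous parents.
import torch
import torch.nn as nn

latent_dim = 4

# Hypothetical transition network: maps (z_{t-1}, z_t) to a residual/noise estimate.
transition = nn.Sequential(
    nn.Linear(2 * latent_dim, 32),
    nn.LeakyReLU(),
    nn.Linear(32, latent_dim),
)

def sparsity_penalty(z_prev: torch.Tensor, z_curr: torch.Tensor) -> torch.Tensor:
    """L1 norm of the transition Jacobian w.r.t. time-delayed (z_prev) and
    instantaneous (z_curr) latent inputs, used as a sparse-influence regularizer."""
    def f(zp, zc):
        return transition(torch.cat([zp, zc], dim=-1))
    jac_prev, jac_curr = torch.autograd.functional.jacobian(
        f, (z_prev, z_curr), create_graph=True
    )
    return jac_prev.abs().sum() + jac_curr.abs().sum()

# Usage: in practice this term would be added to an ELBO-style objective
# (reconstruction + KL) of a temporally variational architecture.
z_prev = torch.randn(latent_dim)
z_curr = torch.randn(latent_dim)
loss = sparsity_penalty(z_prev, z_curr)  # + reconstruction + KL terms
loss.backward()
```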
Related papers
- Temporally Disentangled Representation Learning under Unknown Nonstationarity [35.195001384417395]
We introduce NCTRL, a principled estimation framework, to reconstruct time-delayed latent causal variables.
Empirical evaluations demonstrated the reliable identification of time-delayed latent causal influences.
arXiv Detail & Related papers (2023-10-28T06:46:03Z) - Identifiable Latent Polynomial Causal Models Through the Lens of Change [82.14087963690561]
Causal representation learning aims to unveil latent high-level causal representations from observed low-level data.
One of its primary tasks is to provide reliable assurance of identifying these latent causal models, known as identifiability.
arXiv Detail & Related papers (2023-10-24T07:46:10Z) - Nonparametric Identifiability of Causal Representations from Unknown
Interventions [63.1354734978244]
We study causal representation learning, the task of inferring latent causal variables and their causal relations from mixtures of the variables.
Our goal is to identify both the ground truth latents and their causal graph up to a set of ambiguities which we show to be irresolvable from interventional data.
arXiv Detail & Related papers (2023-06-01T10:51:58Z) - DOMINO: Visual Causal Reasoning with Time-Dependent Phenomena [59.291745595756346]
We propose a set of visual analytics methods that allow humans to participate in the discovery of causal relations associated with windows of time delay.
Specifically, we leverage a well-established method, logic-based causality, to enable analysts to test the significance of potential causes.
Since an effect can be a cause of other effects, we allow users to aggregate different temporal cause-effect relations found with our method into a visual flow diagram.
arXiv Detail & Related papers (2023-03-12T03:40:21Z) - Identifying Weight-Variant Latent Causal Models [82.14087963690561]
We find that transitivity plays a key role in impeding the identifiability of latent causal representations.
Under some mild assumptions, we can show that the latent causal representations can be identified up to trivial permutation and scaling.
We propose a novel method, termed Structural caUsAl Variational autoEncoder, which directly learns latent causal representations and causal relationships among them.
arXiv Detail & Related papers (2022-08-30T11:12:59Z) - Causal Discovery from Conditionally Stationary Time Series [18.645887749731923]
State-Dependent Causal Inference (SDCI) is able to recover the underlying causal dependencies, provably with fully-observed states and empirically with hidden states.
Improved results over non-causal RNNs in modeling NBA player movements demonstrate the potential of our method.
arXiv Detail & Related papers (2021-10-12T18:12:57Z) - Learning Temporally Causal Latent Processes from General Temporal Data [22.440008291454287]
We propose two provable conditions under which temporally causal latent processes can be identified from their nonlinear mixtures.
Experimental results on various data sets demonstrate that temporally causal latent processes are reliably identified from observed variables.
arXiv Detail & Related papers (2021-10-11T17:16:19Z) - Disentangling Observed Causal Effects from Latent Confounders using
Method of Moments [67.27068846108047]
We provide guarantees on identifiability and learnability under mild assumptions.
We develop efficient algorithms based on coupled tensor decomposition with linear constraints to obtain scalable and guaranteed solutions.
arXiv Detail & Related papers (2021-01-17T07:48:45Z)