Caformer: Rethinking Time Series Analysis from Causal Perspective
- URL: http://arxiv.org/abs/2403.08572v1
- Date: Wed, 13 Mar 2024 14:28:02 GMT
- Title: Caformer: Rethinking Time Series Analysis from Causal Perspective
- Authors: Kexuan Zhang, Xiaobei Zou, Yang Tang
- Abstract summary: We introduce a novel framework called Caformer for time series analysis from a causal perspective.
Our framework comprises three components: Dynamic Learner, Environment Learner, and Dependency Learner.
Our Caformer demonstrates consistent state-of-the-art performance across five mainstream time series analysis tasks.
- Score: 7.354128514581098
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series analysis is a vital task with broad applications in various
domains. However, effectively capturing cross-dimension and cross-time
dependencies in non-stationary time series poses significant challenges,
particularly in the context of environmental factors. The spurious correlation
induced by the environment confounds the causal relationships between
cross-dimension and cross-time dependencies. In this paper, we introduce a
novel framework called Caformer (Causal Transformer) for time series analysis from a causal
perspective. Specifically, our framework comprises three components: Dynamic
Learner, Environment Learner, and Dependency Learner. The Dynamic Learner
unveils dynamic interactions among dimensions, the Environment Learner
mitigates spurious correlations caused by the environment with a back-door
adjustment, and the Dependency Learner aims to infer robust interactions across
both time and dimensions. Our Caformer demonstrates consistent state-of-the-art
performance across five mainstream time series analysis tasks, including long-
and short-term forecasting, imputation, classification, and anomaly detection,
with proper interpretability.
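The back-door adjustment the abstract refers to can be illustrated with a minimal discrete sketch. The data, variable roles, and `backdoor_adjust` helper here are hypothetical illustrations of the general formula P(y | do(x)) = Σ_z P(y | x, z) P(z), not the paper's actual method:

```python
from collections import Counter

def backdoor_adjust(samples, x):
    """Estimate P(y=1 | do(x)) via the back-door formula
    P(y | do(x)) = sum_z P(y | x, z) * P(z),
    where z is an observed environment (confounder).
    `samples` is a list of (x, z, y) tuples."""
    n = len(samples)
    z_counts = Counter(z for _, z, _ in samples)
    total = 0.0
    for z, count in z_counts.items():
        matched = [yi for xi, zi, yi in samples if xi == x and zi == z]
        if not matched:
            continue
        p_y_given_xz = sum(matched) / len(matched)
        total += p_y_given_xz * count / n   # weight by P(z)
    return total

# Hypothetical data: environment z confounds treatment x and outcome y.
data = [(1, 0, 1), (1, 0, 1), (0, 0, 0), (1, 1, 0), (0, 1, 0), (0, 1, 1)]
print(backdoor_adjust(data, x=1))  # 0.5
```

Conditioning on the environment before averaging over its marginal is what blocks the spurious correlation the confounder induces.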
Related papers
- Higher-order Cross-structural Embedding Model for Time Series Analysis [12.35149125898563]
Time series analysis has gained significant attention due to its critical applications in diverse fields such as healthcare, finance, and sensor networks.
Current approaches struggle to model higher-order interactions within time series, and focus on learning temporal or spatial dependencies separately.
We propose Higher-order Cross-structural Embedding Model for Time Series (High-TS), a novel framework that jointly models both temporal and spatial perspectives.
arXiv Detail & Related papers (2024-10-30T12:51:14Z) - Decoupled Marked Temporal Point Process using Neural Ordinary Differential Equations [14.828081841581296]
A Marked Temporal Point Process (MTPP) is a process whose realization is a set of event-time data.
Recent studies have utilized deep neural networks to capture complex temporal dependencies of events.
We propose a Decoupled MTPP framework that disentangles characterization of a process into a set of evolving influences from different events.
arXiv Detail & Related papers (2024-06-10T10:15:32Z) - On the Identification of Temporally Causal Representation with Instantaneous Dependence [50.14432597910128]
Temporally causal representation learning aims to identify the latent causal process from time series observations.
Most methods require the assumption that the latent causal processes do not have instantaneous relations.
We propose an IDentification framework for instantaneOus Latent dynamics.
arXiv Detail & Related papers (2024-05-24T08:08:05Z) - Learning Time-aware Graph Structures for Spatially Correlated Time Series Forecasting [30.93275270960829]
We propose Time-aware Graph Structure Learning (TagSL), which extracts time-aware correlations among time series.
We also present a Graph Convolution-based Gated Recurrent Unit (GCGRU), that jointly captures spatial and temporal dependencies.
Finally, we introduce a unified framework named Time-aware Graph Convolutional Recurrent Network (TGCRN), combining TagSL and GCGRU in an encoder-decoder architecture for multi-step spatio-temporal forecasting.
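A graph convolution-based GRU of the kind GCGRU describes can be sketched as follows. The uniform adjacency, dimensions, and exact gating layout are illustrative assumptions, not the paper's actual design:

```python
import numpy as np

def gcgru_step(h, x, a_hat, Wz, Wr, Wh):
    """One step of a sketched graph-convolutional GRU.
    h: (N, d) hidden state per node; x: (N, f) node inputs;
    a_hat: (N, N) normalized adjacency; W*: (f+d, d) gate weights.
    The graph convolution a_hat @ [x, h] mixes neighbor information
    before the standard GRU gating is applied."""
    def sigmoid(v):
        return 1.0 / (1.0 + np.exp(-v))
    xh = a_hat @ np.concatenate([x, h], axis=1)        # neighborhood mixing
    z = sigmoid(xh @ Wz)                               # update gate
    r = sigmoid(xh @ Wr)                               # reset gate
    xh_r = a_hat @ np.concatenate([x, r * h], axis=1)
    h_tilde = np.tanh(xh_r @ Wh)                       # candidate state
    return (1 - z) * h + z * h_tilde

# Toy example: 3 nodes, 2 input features, 4 hidden units.
rng = np.random.default_rng(0)
N, f, d = 3, 2, 4
a_hat = np.full((N, N), 1.0 / N)                       # uniform "adjacency"
h = np.zeros((N, d))
x = rng.normal(size=(N, f))
Wz, Wr, Wh = (rng.normal(scale=0.1, size=(f + d, d)) for _ in range(3))
h_next = gcgru_step(h, x, a_hat, Wz, Wr, Wh)
print(h_next.shape)  # (3, 4)
```

Replacing the GRU's dense input transforms with a graph convolution is what lets one recurrent cell capture spatial and temporal dependencies jointly.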
arXiv Detail & Related papers (2023-12-27T04:23:43Z) - Critical Learning Periods Emerge Even in Deep Linear Networks [102.89011295243334]
Critical learning periods are periods early in development where temporary sensory deficits can have a permanent effect on behavior and learned representations.
Despite the radical differences between biological and artificial networks, critical learning periods have been empirically observed in both systems.
arXiv Detail & Related papers (2023-08-23T16:01:50Z) - Correlation-aware Spatial-Temporal Graph Learning for Multivariate Time-series Anomaly Detection [67.60791405198063]
We propose a correlation-aware spatial-temporal graph learning (termed CST-GL) for time series anomaly detection.
CST-GL explicitly captures the pairwise correlations via a multivariate time series correlation learning module.
A novel anomaly scoring component is further integrated into CST-GL to estimate the degree of an anomaly in a purely unsupervised manner.
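A generic unsupervised anomaly score over per-timestep model errors can be sketched with a robust z-score. This is offered only as an illustration of label-free scoring, not as CST-GL's actual component:

```python
import numpy as np

def anomaly_scores(errors):
    """Label-free anomaly degree from per-timestep model errors:
    a robust z-score against the median and the median absolute
    deviation (MAD), so no supervision is required."""
    errors = np.asarray(errors, float)
    med = np.median(errors)
    mad = np.median(np.abs(errors - med)) + 1e-8   # avoid divide-by-zero
    return (errors - med) / mad

scores = anomaly_scores([0.1, 0.12, 0.11, 0.09, 2.5, 0.1])
print(scores.argmax())  # 4 — the outlying error stands out
```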
arXiv Detail & Related papers (2023-07-17T11:04:27Z) - Temporal Relevance Analysis for Video Action Models [70.39411261685963]
We first propose a new approach to quantify the temporal relationships between frames captured by CNN-based action models.
We then conduct comprehensive experiments and in-depth analysis to provide a better understanding of how temporal modeling is affected.
arXiv Detail & Related papers (2022-04-25T19:06:48Z) - Path Signature Area-Based Causal Discovery in Coupled Time Series [0.0]
We propose the application of confidence sequences to analyze the significance of the magnitude of the signed area between two variables.
This approach provides a new way to define the confidence of a causal link existing between two time series.
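The signed area between two series, whose magnitude those confidence sequences assess, can be computed discretely. This sketch uses the standard shoelace approximation of 0.5 ∫ (x dy − y dx), not the paper's exact estimator:

```python
import numpy as np

def signed_area(x, y):
    """Discrete signed (Levy-style) area swept by the path (x_t, y_t),
    approximating 0.5 * integral(x dy - y dx) for the path shifted to
    start at the origin (the shoelace formula on the sampled points)."""
    x = np.asarray(x, float) - x[0]
    y = np.asarray(y, float) - y[0]
    dx, dy = np.diff(x), np.diff(y)
    return 0.5 * np.sum(x[:-1] * dy - y[:-1] * dx)

# A counter-clockwise unit circle encloses area ~pi.
t = np.linspace(0, 2 * np.pi, 1000)
print(signed_area(np.cos(t), np.sin(t)))
```

A persistently large signed area indicates one series systematically leading the other, which is the geometric cue behind this style of causal discovery.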
arXiv Detail & Related papers (2021-10-23T19:57:22Z) - Decoupling Long- and Short-Term Patterns in Spatiotemporal Inference [31.245426664456257]
Deploying sensors at scale is impractical due to their high cost, so obtaining fine-grained measurements has long been a pressing issue.
We propose a graph-based spatiotemporal attention network to learn the relations across space and time for short-term patterns.
arXiv Detail & Related papers (2021-09-16T03:06:31Z) - Learning Temporal Dynamics from Cycles in Narrated Video [85.89096034281694]
We propose a self-supervised solution to the problem of learning to model how the world changes as time elapses.
Our model learns modality-agnostic functions to predict forward and backward in time, which must undo each other when composed.
We apply the learned dynamics model without further training to various tasks, such as predicting future action and temporally ordering sets of images.
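The requirement that the forward and backward models undo each other when composed can be expressed as a cycle-consistency penalty. This sketch with toy linear dynamics is an illustration of the idea, not the paper's modality-agnostic implementation:

```python
import numpy as np

def cycle_consistency_loss(forward, backward, states):
    """Penalize forward/backward temporal models that fail to undo
    each other: mean || backward(forward(s)) - s ||^2 over states.
    `forward` and `backward` are arbitrary state-transition functions."""
    losses = [np.mean((backward(forward(s)) - s) ** 2) for s in states]
    return float(np.mean(losses))

# Toy linear dynamics: forward doubles the state, backward halves it,
# so the cycle closes and the loss is exactly zero.
states = [np.array([1.0, 2.0]), np.array([-0.5, 3.0])]
print(cycle_consistency_loss(lambda s: 2 * s, lambda s: s / 2, states))  # 0.0
```

Minimizing this loss is what forces the two directions of the dynamics model to be mutual inverses, which is the self-supervisory signal the cycle formulation provides.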
arXiv Detail & Related papers (2021-01-07T02:41:32Z) - Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of the phase space.
Our approach is either as competitive as or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.