TiVaT: A Transformer with a Single Unified Mechanism for Capturing Asynchronous Dependencies in Multivariate Time Series Forecasting
- URL: http://arxiv.org/abs/2410.01531v2
- Date: Fri, 31 Jan 2025 02:32:39 GMT
- Authors: Junwoo Ha, Hyukjae Kwon, Sungsoo Kim, Kisu Lee, Seungjae Park, Ha Young Kim
- Abstract summary: TiVaT is a novel architecture incorporating a single unified module, a Joint-Axis (JA) attention module.
The JA attention module dynamically selects relevant features to particularly capture asynchronous interactions.
Extensive experiments demonstrate TiVaT's overall performance across diverse datasets.
- Score: 4.733959271565453
- Abstract: Multivariate time series (MTS) forecasting is vital across various domains but remains challenging due to the need to model temporal and inter-variate dependencies simultaneously. Existing channel-dependent models, among which Transformer-based models dominate, process these dependencies separately, limiting their capacity to capture complex interactions such as lead-lag dynamics. To address this issue, we propose TiVaT (Time-variate Transformer), a novel architecture built around a single unified module, Joint-Axis (JA) attention, that performs temporal and variate modeling concurrently. The JA attention module dynamically selects relevant features, in particular to capture asynchronous interactions. In addition, we introduce distance-aware time-variate sampling in the JA attention, a novel mechanism that extracts significant patterns through a learned 2D embedding space while reducing noise. Extensive experiments demonstrate TiVaT's strong overall performance across diverse datasets, particularly excelling in scenarios with intricate asynchronous dependencies.
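Since this digest carries no code, here is a minimal sketch of what a joint-axis attention with distance-aware sampling could look like. The class name `JointAxisAttention`, the tensor shapes, and the top-k neighbor selection are assumptions for illustration, not the authors' implementation.
```python
import torch
import torch.nn as nn

class JointAxisAttention(nn.Module):
    """Sketch: attention over a joint (time x variate) token axis, so any
    (t, v) position can attend to any (t', v') position, including the
    asynchronous lead-lag pairs the paper targets."""

    def __init__(self, d_model: int, n_heads: int = 4, top_k: int = 32):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.top_k = top_k  # assumed neighbor budget (must be <= T*V)

    def forward(self, x: torch.Tensor, coords: torch.Tensor) -> torch.Tensor:
        # x: (B, T, V, d_model); coords: (T*V, 2) points in a 2D space
        B, T, V, D = x.shape
        tokens = x.reshape(B, T * V, D)  # flatten to the joint axis

        # Distance-aware sampling (assumed form): each query may attend
        # only to its top-k nearest neighbors in the 2D embedding space.
        dist = torch.cdist(coords, coords)                      # (T*V, T*V)
        keep = dist.topk(self.top_k, largest=False).indices
        mask = torch.ones(T * V, T * V, dtype=torch.bool, device=x.device)
        mask.scatter_(1, keep, False)                           # False = keep

        out, _ = self.attn(tokens, tokens, tokens, attn_mask=mask)
        return out.reshape(B, T, V, D)

x = torch.randn(2, 24, 7, 32)          # batch, time, variates, features
coords = torch.randn(24 * 7, 2)        # stand-in for a *learned* embedding
print(JointAxisAttention(32, top_k=16)(x, coords).shape)  # (2, 24, 7, 32)
```
In TiVaT the 2D embedding would be learned jointly with the network; passing fixed `coords` here just keeps the sketch self-contained.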
Related papers
- TimeCNN: Refining Cross-Variable Interaction on Time Point for Time Series Forecasting [44.04862924157323]
Transformer-based models demonstrate significant potential in modeling cross-time and cross-variable interactions.
We propose TimeCNN, a model that refines cross-variable interactions at each time point to enhance time series forecasting (see the sketch after this entry).
Extensive experiments conducted on 12 real-world datasets demonstrate that TimeCNN consistently outperforms state-of-the-art models.
arXiv Detail & Related papers (2024-10-07T09:16:58Z)
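Going by the title's "on time point" phrasing, a minimal sketch of per-time-point cross-variable refinement might look as follows; TimeCNN's actual design likely differs (e.g., timepoint-specific kernels), so `CrossVariablePointwise` and its residual form are assumptions.
```python
import torch
import torch.nn as nn

class CrossVariablePointwise(nn.Module):
    """Sketch: mix information across variables independently at each
    time point, leaving the temporal axis untouched."""

    def __init__(self, n_vars: int):
        super().__init__()
        # A kernel of size 1 along time means each time point is refined
        # using only the other variables at that same time point.
        self.mix = nn.Conv1d(n_vars, n_vars, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_vars, seq_len)
        return x + self.mix(x)  # residual cross-variable refinement

x = torch.randn(8, 7, 96)                  # 7 variables, 96 time steps
print(CrossVariablePointwise(7)(x).shape)  # torch.Size([8, 7, 96])
```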
- TimeDiT: General-purpose Diffusion Transformers for Time Series Foundation Model [11.281386703572842]
TimeDiT is a diffusion transformer model that combines temporal dependency learning with probabilistic sampling.
TimeDiT employs a unified masking mechanism to harmonize training and inference across diverse tasks (see the sketch after this entry).
Our systematic evaluation demonstrates TimeDiT's effectiveness on fundamental tasks, i.e., forecasting and imputation, in both zero-shot and fine-tuning settings.
arXiv Detail & Related papers (2024-09-03T22:31:57Z)
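The summary says only that one masking mechanism harmonizes diverse tasks; below is a common way such unification is done, hiding the forecast horizon for forecasting and random positions for imputation. The function name, the default horizon, and the 30% missing ratio are invented for illustration.
```python
import torch

def task_mask(seq_len: int, task: str, horizon: int = 24,
              missing_ratio: float = 0.3) -> torch.Tensor:
    """Sketch: one boolean mask (True = position hidden from the model)
    covers both tasks, so the same denoising objective applies to each."""
    mask = torch.zeros(seq_len, dtype=torch.bool)
    if task == "forecast":
        mask[-horizon:] = True                       # hide the future
    elif task == "impute":
        mask = torch.rand(seq_len) < missing_ratio   # hide random points
    else:
        raise ValueError(f"unknown task: {task}")
    return mask

print(task_mask(96, "forecast").sum())  # tensor(24)
```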
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose UniTST, a Transformer-based model with a unified attention mechanism over flattened patch tokens (see the sketch after this entry).
Although the architecture is simple, it offers compelling performance in our experiments on several time series forecasting datasets.
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
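A minimal sketch of unified attention over flattened patch tokens as summarized above; the patch counts and dimensions are arbitrary, and real patching/embedding steps are omitted.
```python
import torch
import torch.nn as nn

B, V, N, D = 8, 7, 12, 64   # batch, variates, patches per variate, dim
patch_tokens = torch.randn(B, V, N, D)

# Flatten the variate and patch axes into one token axis so a single
# attention layer can relate any patch of any series to any other,
# capturing inter-series and intra-series dependencies at once.
tokens = patch_tokens.reshape(B, V * N, D)

attn = nn.MultiheadAttention(embed_dim=D, num_heads=4, batch_first=True)
out, weights = attn(tokens, tokens, tokens)
print(out.shape)  # torch.Size([8, 84, 64])
```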
- CATS: Enhancing Multivariate Time Series Forecasting by Constructing Auxiliary Time Series as Exogenous Variables [9.95711569148527]
We introduce a method to Construct Auxiliary Time Series (CATS) that functions like a 2D temporal-contextual attention mechanism (see the sketch after this entry).
Even with a basic 2-layer MLP as the core predictor, CATS achieves state-of-the-art performance while significantly reducing complexity and parameter counts compared to previous multivariate models.
arXiv Detail & Related papers (2024-03-04T01:52:40Z)
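A hedged sketch of the auxiliary-series idea: construct extra series from the inputs and hand originals plus auxiliaries to a small core predictor. The linear construction and layer sizes are guesses based only on the summary above, not CATS's actual design.
```python
import torch
import torch.nn as nn

class CATSSketch(nn.Module):
    def __init__(self, n_vars: int, n_aux: int, seq_len: int, horizon: int):
        super().__init__()
        # Auxiliary series as learned combinations of the input series
        # (one simple way to "construct" them; CATS may use richer maps).
        self.make_aux = nn.Linear(n_vars, n_aux)
        # Basic 2-layer MLP as the core predictor, per the summary.
        self.predictor = nn.Sequential(
            nn.Linear(seq_len * (n_vars + n_aux), 128),
            nn.ReLU(),
            nn.Linear(128, horizon * n_vars),
        )
        self.horizon, self.n_vars = horizon, n_vars

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_vars)
        aux = self.make_aux(x)                     # (batch, seq_len, n_aux)
        full = torch.cat([x, aux], dim=-1).flatten(1)
        return self.predictor(full).view(-1, self.horizon, self.n_vars)

y = CATSSketch(7, 4, 96, 24)(torch.randn(8, 96, 7))
print(y.shape)  # torch.Size([8, 24, 7])
```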
- A Decoupled Spatio-Temporal Framework for Skeleton-based Action Segmentation [89.86345494602642]
Existing methods suffer from weak temporal modeling capability.
We propose a Decoupled Spatio-Temporal framework (DeST) to address these issues.
DeST significantly outperforms current state-of-the-art methods with less computational complexity.
arXiv Detail & Related papers (2023-12-10T09:11:39Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted for their high prediction capacity, though the self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing LTTF methods in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Because decoding is non-autoregressive, our model avoids the influence of cumulative error and does not increase time complexity (see the sketch after this entry).
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
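The key claim above is that non-autoregressive decoding avoids cumulative error; the toy contrast below illustrates that point with placeholder linear models, not MANF's flow architecture.
```python
import torch
import torch.nn as nn

seq_len, horizon, d = 96, 24, 1
history = torch.randn(1, seq_len, d)

# Autoregressive: each step consumes the previous *prediction*, so any
# error is fed back in and can compound over the horizon.
step_model = nn.Linear(seq_len * d, d)
window = history.clone()
ar_preds = []
for _ in range(horizon):
    nxt = step_model(window.flatten(1))           # predict one step
    ar_preds.append(nxt)
    window = torch.cat([window[:, 1:], nxt.unsqueeze(1)], dim=1)

# Non-autoregressive: all horizon steps come from one forward pass on
# the observed history alone, so errors cannot feed back.
joint_model = nn.Linear(seq_len * d, horizon * d)
na_preds = joint_model(history.flatten(1)).view(1, horizon, d)
print(len(ar_preds), na_preds.shape)  # 24 torch.Size([1, 24, 1])
```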
- Multi-Scale Adaptive Graph Neural Network for Multivariate Time Series Forecasting [8.881348323807158]
We propose a multi-scale adaptive graph neural network (MAGNN) to address this issue (see the adaptive-adjacency sketch after this entry).
Experiments on four real-world datasets demonstrate that MAGNN outperforms the state-of-the-art methods across various settings.
arXiv Detail & Related papers (2022-01-13T08:04:10Z)
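MAGNN's central ingredient is an adaptively learned graph over variables. The sketch below shows the common recipe for inferring such an adjacency from learned node embeddings; MAGNN's multi-scale construction is more elaborate, so treat this as a generic pattern rather than the paper's method.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveGraphLayer(nn.Module):
    """Sketch: infer a variable-to-variable graph from learned node
    embeddings, then propagate features over it."""

    def __init__(self, n_nodes: int, emb_dim: int, feat_dim: int):
        super().__init__()
        self.src = nn.Parameter(torch.randn(n_nodes, emb_dim))
        self.dst = nn.Parameter(torch.randn(n_nodes, emb_dim))
        self.proj = nn.Linear(feat_dim, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_nodes, feat_dim)
        adj = F.softmax(F.relu(self.src @ self.dst.T), dim=-1)  # learned graph
        return F.relu(self.proj(adj @ x))  # one round of message passing

x = torch.randn(8, 7, 32)
print(AdaptiveGraphLayer(7, 16, 32)(x).shape)  # torch.Size([8, 7, 32])
```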
- Deep Explicit Duration Switching Models for Time Series [84.33678003781908]
We propose a flexible model capable of identifying both state-dependent and time-dependent switching dynamics (see the sketch after this entry).
State-dependent switching is enabled by a recurrent state-to-switch connection.
An explicit duration count variable is used to improve the time-dependent switching behavior.
arXiv Detail & Related papers (2021-10-26T17:35:21Z)
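A toy generative loop showing what an explicit duration count variable does: the regime may switch only when its sampled duration expires, which is exactly the time-dependent switching the summary mentions. All distributions and parameters here are invented.
```python
import numpy as np

rng = np.random.default_rng(0)
means = {0: 0.0, 1: 3.0}          # two regimes with different levels

def sample_duration(state: int) -> int:
    # Explicit duration variable: how long we commit to this regime.
    return int(rng.poisson(lam=10 if state == 0 else 5)) + 1

state, remaining = 0, sample_duration(0)
series = []
for _ in range(60):
    series.append(means[state] + rng.normal(scale=0.3))
    remaining -= 1
    if remaining == 0:            # switching allowed only at expiry
        state = 1 - state
        remaining = sample_duration(state)

print(np.round(series[:10], 2))
```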
- Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies (a sketch of the spatial transformer follows this entry).
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
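The spatial transformer's core move is to let input-dependent attention weights act as a dynamic, directed adjacency over road-network nodes. A minimal sketch of that idea, with assumed names and shapes:
```python
import torch
import torch.nn as nn

class SpatialTransformerLayer(nn.Module):
    """Sketch: self-attention across road-network nodes at each time
    step; the attention matrix is input-dependent and asymmetric, so it
    acts as a dynamic, *directed* spatial dependency graph."""

    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch * time, n_nodes, d_model) -- fold time into the batch
        # so attention mixes information across nodes only.
        out, directed_weights = self.attn(x, x, x)
        return out

B, T, N, D = 4, 12, 50, 64
x = torch.randn(B * T, N, D)
print(SpatialTransformerLayer(D)(x).shape)  # torch.Size([48, 50, 64])
```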