Revisiting the Temporal Modeling in Spatio-Temporal Predictive Learning
under A Unified View
- URL: http://arxiv.org/abs/2310.05829v1
- Date: Mon, 9 Oct 2023 16:17:42 GMT
- Title: Revisiting the Temporal Modeling in Spatio-Temporal Predictive Learning
under A Unified View
- Authors: Cheng Tan, Jue Wang, Zhangyang Gao, Siyuan Li, Lirong Wu, Jun Xia,
Stan Z. Li
- Abstract summary: We introduce USTEP (Unified Spatio-TEmporal Predictive learning), an innovative framework that reconciles the recurrent-based and recurrent-free methods by integrating both micro-temporal and macro-temporal scales.
- Score: 73.73667848619343
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spatio-temporal predictive learning plays a crucial role in self-supervised
learning, with wide-ranging applications across diverse fields.
Previous approaches for temporal modeling fall into two categories:
recurrent-based and recurrent-free methods. The former, while meticulously
processing frames one by one, neglect short-term spatio-temporal information
redundancies, leading to inefficiencies. The latter naively stack frames
sequentially, overlooking the inherent temporal dependencies. In this paper, we
re-examine the two dominant temporal modeling approaches within the realm of
spatio-temporal predictive learning, offering a unified perspective. Building
upon this analysis, we introduce USTEP (Unified Spatio-TEmporal Predictive
learning), an innovative framework that reconciles the recurrent-based and
recurrent-free methods by integrating both micro-temporal and macro-temporal
scales. Extensive experiments on a wide range of spatio-temporal predictive
learning tasks demonstrate that USTEP achieves significant improvements over existing
temporal modeling approaches, thereby establishing it as a robust solution for
a wide range of spatio-temporal applications.
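The distinction drawn above between recurrent-based and recurrent-free temporal modeling, and the idea of combining micro-temporal and macro-temporal scales, can be made concrete with a small sketch. The following PyTorch code is an illustrative assumption, not the USTEP implementation: RecurrentPredictor, RecurrentFreePredictor, and the naive fusion in MicroMacroPredictor are hypothetical names and designs chosen only to contrast per-step recurrence with whole-segment processing.

```python
# Hypothetical sketch (not the official USTEP code): contrasts the two temporal
# modeling paradigms discussed in the abstract and a naive way to combine a
# micro-temporal (per-step) and a macro-temporal (whole-segment) pathway.
import torch
import torch.nn as nn


class RecurrentPredictor(nn.Module):
    """Recurrent-based: processes frames one by one with a ConvGRU-like cell."""

    def __init__(self, channels: int, hidden: int):
        super().__init__()
        self.update = nn.Conv2d(channels + hidden, hidden, 3, padding=1)
        self.readout = nn.Conv2d(hidden, channels, 3, padding=1)
        self.hidden = hidden

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (B, T, C, H, W) -> predict the next frame
        b, t, c, h, w = frames.shape
        state = frames.new_zeros(b, self.hidden, h, w)
        for i in range(t):  # micro-temporal scale: strict step-by-step recurrence
            state = torch.tanh(self.update(torch.cat([frames[:, i], state], dim=1)))
        return self.readout(state)


class RecurrentFreePredictor(nn.Module):
    """Recurrent-free: stacks the whole segment along channels in one shot."""

    def __init__(self, channels: int, steps: int, hidden: int):
        super().__init__()
        self.net = nn.Sequential(  # macro-temporal scale: sees the segment jointly
            nn.Conv2d(channels * steps, hidden, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, channels, 3, padding=1),
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        b, t, c, h, w = frames.shape
        return self.net(frames.reshape(b, t * c, h, w))


class MicroMacroPredictor(nn.Module):
    """Toy fusion of both scales; the actual USTEP design differs."""

    def __init__(self, channels: int, steps: int, hidden: int = 32):
        super().__init__()
        self.micro = RecurrentPredictor(channels, hidden)
        self.macro = RecurrentFreePredictor(channels, steps, hidden)
        self.fuse = nn.Conv2d(channels * 2, channels, 1)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        return self.fuse(torch.cat([self.micro(frames), self.macro(frames)], dim=1))


if __name__ == "__main__":
    clip = torch.randn(2, 10, 1, 64, 64)  # Moving MNIST-sized toy input
    print(MicroMacroPredictor(channels=1, steps=10)(clip).shape)  # (2, 1, 64, 64)
```

In this toy setup the recurrent path carries fine-grained step-to-step dependencies while the stacked path sees the whole segment at once; how USTEP actually reconciles the two scales is described in the paper itself.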
Related papers
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Spatiotemporal Observer Design for Predictive Learning of
High-Dimensional Data [6.214987339902511]
An observer theory-guided deep learning architecture, called the Spatiotemporal Observer, is designed for predictive learning of high-dimensional data.
This framework can capture spatiotemporal dynamics and make accurate predictions in both one-step and multi-step-ahead scenarios.
arXiv Detail & Related papers (2024-02-23T12:28:31Z) - Triplet Attention Transformer for Spatiotemporal Predictive Learning [9.059462850026216]
We propose an innovative triplet attention transformer designed to capture both inter-frame dynamics and intra-frame static features.
The model incorporates the Triplet Attention Module (TAM), which replaces traditional recurrent units by exploring self-attention mechanisms in temporal, spatial, and channel dimensions.
arXiv Detail & Related papers (2023-10-28T12:49:33Z) - OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive
Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectories, human motion, driving scenes, traffic flow and weather forecasting.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z) - Temporal Attention Unit: Towards Efficient Spatiotemporal Predictive
Learning [42.22064610886404]
We present a general framework of predictive learning, in which the encoder and decoder capture intra-frame features and the middle temporal module catches inter-frame dependencies.
To parallelize the temporal module, we propose the Temporal Attention Unit (TAU), which decomposes temporal attention into intra-frame statical attention and inter-frame dynamical attention (a minimal sketch of this decomposition appears after the list below).
arXiv Detail & Related papers (2022-06-24T07:43:50Z) - Towards Spatio-Temporal Aware Traffic Time Series Forecasting--Full
Version [37.09531298150374]
Traffic time series forecasting is challenging because temporal patterns vary across time; for example, some periods of a day exhibit stronger temporal correlations than others.
Existing spatio-temporal models employ a shared parameter space irrespective of time locations and time periods, assuming that temporal correlations are similar across locations and stable across time, which does not always hold.
We propose a framework that aims at turning spatio-temporal agnostic models into spatio-temporal aware models.
arXiv Detail & Related papers (2022-03-29T16:44:56Z) - Self-Attention Neural Bag-of-Features [103.70855797025689]
We build on the recently introduced 2D-Attention and reformulate the attention learning methodology.
We propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information.
arXiv Detail & Related papers (2022-01-26T17:54:14Z) - Supporting Optimal Phase Space Reconstructions Using Neural Network
Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of phase spaces.
Our approach is as competitive as, or better than, most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z) - A Comprehensive Study on Temporal Modeling for Online Action Detection [50.558313106389335]
Online action detection (OAD) is a practical yet challenging task, which has attracted increasing attention in recent years.
This paper aims to provide a comprehensive study on temporal modeling for OAD including four meta types of temporal modeling methods.
We present several hybrid temporal modeling methods, which outperform the recent state-of-the-art methods with sizable margins on THUMOS-14 and TVSeries.
arXiv Detail & Related papers (2020-01-21T13:12:58Z)
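As referenced in the Temporal Attention Unit entry above, the decomposition of temporal attention into intra-frame statical attention and inter-frame dynamical attention can be sketched roughly as follows. This is a hedged approximation under assumptions, not the authors' released TAU code: the statical part is approximated with a large-kernel depthwise convolution and the dynamical part with a squeeze-and-excitation style gate, and TAULikeUnit with its parameters is a hypothetical name.

```python
# Hedged sketch of a TAU-style attention decomposition (illustrative only).
import torch
import torch.nn as nn


class TAULikeUnit(nn.Module):
    """Illustrative decomposition; the published TAU differs in detail."""

    def __init__(self, dim: int, kernel: int = 11):
        super().__init__()
        # Intra-frame "statical" attention: a large-kernel depthwise convolution
        # producing a spatial attention map over the same features (assumption).
        self.static_attn = nn.Sequential(
            nn.Conv2d(dim, dim, kernel, padding=kernel // 2, groups=dim),
            nn.Conv2d(dim, dim, 1),
        )
        # Inter-frame "dynamical" attention: a squeeze-and-excitation style gate
        # over channels, assumed to carry the stacked time steps (assumption).
        self.dynamic_attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(dim, dim // 4, 1),
            nn.ReLU(),
            nn.Conv2d(dim // 4, dim, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W), where C folds the temporal dimension (T * C_frame)
        return x * self.static_attn(x) * self.dynamic_attn(x)


if __name__ == "__main__":
    feats = torch.randn(2, 64, 16, 16)  # hypothetical hidden features
    print(TAULikeUnit(64)(feats).shape)  # torch.Size([2, 64, 16, 16])
```

The key point the sketch conveys is that both attention branches are applied in parallel over a stacked representation, which avoids the step-by-step recurrence of recurrent units.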
This list is automatically generated from the titles and abstracts of the papers in this site.