ODformer: Spatial-Temporal Transformers for Long Sequence
Origin-Destination Matrix Forecasting Against Cross Application Scenario
- URL: http://arxiv.org/abs/2208.08218v1
- Date: Wed, 17 Aug 2022 10:58:46 GMT
- Title: ODformer: Spatial-Temporal Transformers for Long Sequence
Origin-Destination Matrix Forecasting Against Cross Application Scenario
- Authors: Jin Huang, Bosong Huang, Weihao Yu, Jing Xiao, Ruzhong Xie, Ke Ruan
- Abstract summary: Origin-Destination (OD) matrices record directional flow data between pairs of OD regions.
Most of the related methods are designed for very short sequence time series forecasting in specific application scenarios.
We propose a Transformer-like model named ODformer, with two salient characteristics.
- Score: 18.241804329382543
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Origin-Destination (OD) matrices record directional flow data between pairs
of OD regions. The intricate spatiotemporal dependencies in these matrices make
the OD matrix forecasting (ODMF) problem non-trivial and difficult to tackle.
However, most related methods are designed for very short sequence time series
forecasting in specific application scenarios, and so cannot meet the varied
scenario and forecasting-length requirements of practical applications. To
address these issues, we propose a Transformer-like model named ODformer with
two salient characteristics: (i) a novel OD Attention mechanism, which captures
the special spatial dependencies between OD pairs that share the same origin (or
destination) and, combined with a 2D-GCN that captures spatial dependencies
between OD regions, greatly improves the model's ability to predict across
application scenarios; and (ii) a PeriodSparse Self-attention mechanism that
effectively forecasts long sequence OD matrix series while adapting to the
periodic differences of different scenarios. Extensive experiments on three
application backgrounds (transportation traffic, IP backbone network traffic,
and crowd flow) show that our method outperforms state-of-the-art methods.
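To make the same-origin / same-destination idea behind OD Attention concrete, here is a minimal PyTorch sketch. It is an illustration inferred from the abstract, not the authors' implementation: the tensor layout, the decomposition into row-wise and column-wise attention passes, and the residual combination are all assumptions made for the example.

```python
# Illustrative sketch (not ODformer code): each OD pair (i, j) attends to the
# other pairs that share its origin (row i) and to the pairs that share its
# destination (column j).
import torch
import torch.nn as nn


class ODPairAttention(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, N, N, D) -- features for every origin-destination pair.
        b, n, _, d = x.shape
        # Same-origin attention: each row (fixed origin) is treated as a
        # sequence of N destination entries.
        rows = x.reshape(b * n, n, d)
        rows, _ = self.row_attn(rows, rows, rows)
        rows = rows.reshape(b, n, n, d)
        # Same-destination attention: each column (fixed destination) is
        # treated as a sequence of N origin entries.
        cols = x.permute(0, 2, 1, 3).reshape(b * n, n, d)
        cols, _ = self.col_attn(cols, cols, cols)
        cols = cols.reshape(b, n, n, d).permute(0, 2, 1, 3)
        return x + rows + cols  # residual combination (our choice)


if __name__ == "__main__":
    od = torch.randn(2, 16, 16, 32)        # 2 samples, 16 regions, 32-dim features
    out = ODPairAttention(32)(od)
    print(out.shape)                        # torch.Size([2, 16, 16, 32])
```

In this sketch, factoring attention into row and column passes keeps the cost at roughly O(N^3) per matrix rather than the O(N^4) of full attention over all N^2 OD pairs.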
Related papers
- Double-Path Adaptive-correlation Spatial-Temporal Inverted Transformer for Stock Time Series Forecasting [1.864621482724548]
We propose a Double-Path Adaptive-correlation Spatial-Temporal Inverted Transformer (DPA-STIFormer) to more comprehensively extract dynamic spatial information from stock data.
Experiments conducted on four stock market datasets demonstrate state-of-the-art results, validating the model's superior capability in uncovering latent temporal-correlation patterns.
arXiv Detail & Related papers (2024-09-24T01:53:22Z)
- Multi-Source and Test-Time Domain Adaptation on Multivariate Signals using Spatio-Temporal Monge Alignment [59.75420353684495]
Machine learning applications on signals such as computer vision or biomedical data often face challenges due to the variability that exists across hardware devices or session recordings.
In this work, we propose Spatio-Temporal Monge Alignment (STMA) to mitigate these variabilities.
We show that STMA leads to significant and consistent performance gains between datasets acquired with very different settings.
arXiv Detail & Related papers (2024-07-19T13:33:38Z)
- A DeepLearning Framework for Dynamic Estimation of Origin-Destination Sequence [63.70447384033326]
This paper proposes an integrated method, which uses deep learning to infer the structure of the OD sequence and uses structural constraints to guide traditional numerical optimization.
Our experiments show that the neural network can effectively infer the structure of the OD sequence and provide practical constraints for numerical optimization to obtain better results.
arXiv Detail & Related papers (2023-07-11T04:58:45Z)
- Continuous-Time Modeling of Counterfactual Outcomes Using Neural Controlled Differential Equations [84.42837346400151]
Estimating counterfactual outcomes over time has the potential to unlock personalized healthcare.
Existing causal inference approaches consider regular, discrete-time intervals between observations and treatment decisions.
We propose a controllable simulation environment based on a model of tumor growth for a range of scenarios.
arXiv Detail & Related papers (2022-06-16T17:15:15Z)
- Dynamic Graph Learning Based on Hierarchical Memory for Origin-Destination Demand Prediction [12.72319550363076]
This paper provides a dynamic graph representation learning framework for OD demand prediction.
In particular, a hierarchical memory updater is first proposed to maintain a time-aware representation for each node.
A temporal propagation mechanism is provided to aggregate representations of neighbor nodes along a random temporal route.
An objective function is designed to derive the future OD demands according to the most recent node.
arXiv Detail & Related papers (2022-05-29T07:52:35Z)
- Online Metro Origin-Destination Prediction via Heterogeneous Information Aggregation [99.54200992904721]
We propose a novel neural network module termed Heterogeneous Information Aggregation Machine (HIAM) to jointly learn the evolutionary patterns of OD and DO ridership.
An OD modeling branch estimates the potential destinations of unfinished orders explicitly to complement the information of incomplete OD matrices.
A DO modeling branch takes DO matrices as input to capture the spatial-temporal distribution of DO ridership.
Based on the proposed HIAM, we develop a unified Seq2Seq network to forecast the future OD and DO ridership simultaneously.
arXiv Detail & Related papers (2021-07-02T10:11:51Z)
- SMART: Simultaneous Multi-Agent Recurrent Trajectory Prediction [72.37440317774556]
We propose advances that address two key challenges in future trajectory prediction:
multimodality in both training data and predictions, and constant-time inference
regardless of the number of agents.
arXiv Detail & Related papers (2020-07-26T08:17:10Z)
- Forecast Network-Wide Traffic States for Multiple Steps Ahead: A Deep Learning Approach Considering Dynamic Non-Local Spatial Correlation and Non-Stationary Temporal Dependency [6.019104024723682]
This research studies two particular problems in traffic forecasting: (1) capturing the dynamic and non-local spatial correlation between traffic links, and (2) modeling the dynamics of temporal dependency for accurate multi-step-ahead prediction.
We propose a deep learning framework named Spatial-Temporal Sequence to Sequence model (STSeq2Seq) to address these issues.
This model builds on the sequence-to-sequence (seq2seq) architecture to capture temporal features and relies on graph convolution to aggregate spatial information.
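As a rough illustration of this "graph convolution for space, seq2seq for time" pattern, here is a minimal PyTorch sketch. It is not the STSeq2Seq architecture itself: the normalized-adjacency GCN layer, the GRU encoder/decoder, and every hyperparameter below are assumptions made for the example.

```python
# Generic illustration of the graph-convolution-plus-seq2seq pattern
# (not the paper's architecture).
import torch
import torch.nn as nn


class GraphConv(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (B, N, F) node features, adj: (N, N) normalized adjacency.
        return torch.relu(self.lin(adj @ x))


class GCNSeq2Seq(nn.Module):
    def __init__(self, n_nodes: int, feat: int, hidden: int = 64, horizon: int = 3):
        super().__init__()
        self.gcn = GraphConv(feat, hidden)
        self.encoder = nn.GRU(n_nodes * hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_nodes)
        self.horizon = horizon

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (B, T, N, F) -- T historical steps of N-node traffic features.
        b, t, n, f = x.shape
        spatial = self.gcn(x.reshape(b * t, n, f), adj).reshape(b, t, -1)
        _, h = self.encoder(spatial)                    # summarize history
        dec_in = h.transpose(0, 1).repeat(1, self.horizon, 1)
        out, _ = self.decoder(dec_in, h)                # roll out future steps
        return self.readout(out)                        # (B, horizon, N)


if __name__ == "__main__":
    adj = torch.eye(10)                                 # placeholder adjacency
    model = GCNSeq2Seq(n_nodes=10, feat=2)
    print(model(torch.randn(4, 12, 10, 2), adj).shape)  # torch.Size([4, 3, 10])
```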
arXiv Detail & Related papers (2020-04-06T03:40:56Z)
- Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
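To illustrate what "dynamically modeling directed spatial dependencies" can look like in a spatial transformer block, here is a minimal self-attention sketch over graph nodes. It is an assumption-based illustration drawn from the abstract above, not the STTN code; the single-head formulation, layer sizes, and residual update are choices made for the example.

```python
# Illustrative sketch: spatial dependencies between nodes are modeled as
# data-dependent attention scores rather than a fixed adjacency matrix; the
# score matrix is asymmetric, so the learned dependencies are directed.
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, N, D) node features at one time step.
        q, k, v = self.q(x), self.k(x), self.v(x)
        # scores[b, i, j]: how strongly node i attends to node j.
        scores = torch.softmax(q @ k.transpose(-2, -1) / x.size(-1) ** 0.5, dim=-1)
        return x + scores @ v  # residual update


if __name__ == "__main__":
    nodes = torch.randn(8, 50, 16)            # 8 time steps, 50 nodes, 16 features
    print(SpatialAttention(16)(nodes).shape)  # torch.Size([8, 50, 16])
```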