S2TNet: Spatio-Temporal Transformer Networks for Trajectory Prediction
in Autonomous Driving
- URL: http://arxiv.org/abs/2206.10902v1
- Date: Wed, 22 Jun 2022 08:12:31 GMT
- Title: S2TNet: Spatio-Temporal Transformer Networks for Trajectory Prediction
in Autonomous Driving
- Authors: Weihuang Chen and Fangfang Wang and Hongbin Sun
- Abstract summary: We propose S2TNet, which models spatio-temporal interactions with a spatio-temporal Transformer and handles temporal sequences with a temporal Transformer.
The method outperforms state-of-the-art methods on the ApolloScape Trajectory dataset by more than 7% on both the weighted sums of Average and Final Displacement Error.
- Score: 7.862992905548721
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To safely and rationally participate in dense and heterogeneous traffic,
autonomous vehicles need to sufficiently analyze the motion patterns of
surrounding traffic-agents and accurately predict their future trajectories.
This is challenging because the trajectories of traffic-agents are not only
influenced by the traffic-agents themselves but also by spatial interaction
with each other. Previous methods usually rely on the sequential step-by-step
processing of Long Short-Term Memory networks (LSTMs) and merely extract the
interactions between spatial neighbors for single-type traffic-agents. We
propose the Spatio-Temporal Transformer Networks (S2TNet), which model the
spatio-temporal interactions with a spatio-temporal Transformer and handle the
temporal sequences with a temporal Transformer. We feed additional category, shape
and heading information into our networks to handle the heterogeneity of
traffic-agents. The proposed method outperforms state-of-the-art methods on the
ApolloScape Trajectory dataset by more than 7% on both the weighted sums of
Average and Final Displacement Error. Our code is available at
https://github.com/chenghuang66/s2tnet.
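The abstract describes a two-stage design: a spatio-temporal Transformer models interactions among heterogeneous traffic-agents, and a temporal Transformer processes each agent's history, with inputs enriched by category, shape and heading features and evaluation by Average and Final Displacement Error. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, not the authors' implementation; all module names, feature dimensions, and the factorised spatial/temporal attention layout are assumptions made for illustration, and the official code lives at https://github.com/chenghuang66/s2tnet.

```python
# Minimal sketch (NOT the official S2TNet code) of a factorised
# spatio-temporal Transformer for multi-agent trajectory prediction.
# Shapes, layer counts and the input feature layout are assumptions.
import torch
import torch.nn as nn


class SpatioTemporalEncoder(nn.Module):
    """Alternates attention over agents (spatial) and over time steps (temporal)."""

    def __init__(self, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.spatial = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), n_layers
        )
        self.temporal = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), n_layers
        )

    def forward(self, x):  # x: (batch, agents, time, d_model)
        b, a, t, d = x.shape
        # Spatial attention: agents attend to each other at every time step.
        x = self.spatial(x.transpose(1, 2).reshape(b * t, a, d)).reshape(b, t, a, d)
        # Temporal attention: each agent attends over its own history.
        x = self.temporal(x.transpose(1, 2).reshape(b * a, t, d)).reshape(b, a, t, d)
        return x


class TrajectoryPredictor(nn.Module):
    """Embeds position plus category/shape/heading, encodes, regresses future offsets."""

    def __init__(self, in_dim=7, d_model=64, horizon=6):
        super().__init__()
        self.embed = nn.Linear(in_dim, d_model)      # x, y + category, shape, heading
        self.encoder = SpatioTemporalEncoder(d_model)
        self.head = nn.Linear(d_model, horizon * 2)  # future (x, y) per step
        self.horizon = horizon

    def forward(self, feats):  # feats: (batch, agents, obs_len, in_dim)
        h = self.encoder(self.embed(feats))
        out = self.head(h[:, :, -1])                 # decode from the last observed step
        return out.view(feats.size(0), feats.size(1), self.horizon, 2)


def ade_fde(pred, gt):
    """Average / Final Displacement Error over the prediction horizon."""
    dist = torch.linalg.norm(pred - gt, dim=-1)      # (batch, agents, horizon)
    return dist.mean().item(), dist[..., -1].mean().item()


if __name__ == "__main__":
    model = TrajectoryPredictor()
    obs = torch.randn(2, 5, 8, 7)                    # 2 scenes, 5 agents, 8 observed steps
    pred = model(obs)
    print(ade_fde(pred, torch.randn_like(pred)))
```

The factorised layout (spatial attention at each time step, then temporal attention per agent) is one common way to keep attention cost manageable; the ApolloScape benchmark additionally weights ADE/FDE by agent category, which this toy metric omits.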
Related papers
- Improving Traffic Flow Predictions with SGCN-LSTM: A Hybrid Model for Spatial and Temporal Dependencies [55.2480439325792]
This paper introduces the Signal-Enhanced Graph Convolutional Network Long Short Term Memory (SGCN-LSTM) model for predicting traffic speeds across road networks.
Experiments on the PEMS-BAY road network traffic dataset demonstrate the SGCN-LSTM model's effectiveness.
arXiv Detail & Related papers (2024-11-01T00:37:00Z) - Transport-Hub-Aware Spatial-Temporal Adaptive Graph Transformer for
Traffic Flow Prediction [10.722455633629883]
We propose a Transport-Hub-aware spatial-temporal adaptive graph transFormer for traffic flow prediction.
Specifically, we first design a novel spatial self-attention module to capture the dynamic spatial dependencies.
We also employ a temporal self-attention module to detect dynamic temporal patterns in the traffic flow data.
arXiv Detail & Related papers (2023-10-12T13:44:35Z) - PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for
Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising approaches to this problem.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z) - Correlating sparse sensing for large-scale traffic speed estimation: A
Laplacian-enhanced low-rank tensor kriging approach [76.45949280328838]
We propose a Laplacian-enhanced low-rank tensor (LETC) framework featuring both low-rankness and multi-temporal correlations for large-scale traffic speed kriging.
We then design an efficient solution algorithm via several effective numeric techniques to scale up the proposed model to network-wide kriging.
arXiv Detail & Related papers (2022-10-21T07:25:57Z) - STCGAT: Spatial-temporal causal networks for complex urban road traffic
flow prediction [12.223433627287605]
Traffic data are highly nonlinear and have complex spatial correlations between road nodes.
Existing approaches usually use fixed traffic road network topology maps and independent time series modules to capture spatial-temporal correlations.
We propose a new prediction model which captures the spatial dependence of the traffic network through a Graph Attention Network (GAT) and then analyzes the causal relationship of the traffic data.
arXiv Detail & Related papers (2022-03-21T06:38:34Z) - Space Meets Time: Local Spacetime Neural Network For Traffic Flow
Forecasting [11.495992519252585]
We argue that such correlations are universal and play a pivotal role in traffic flow.
We propose a new spacetime interval learning framework that constructs a local-spacetime context of a traffic sensor.
The proposed STNN model can be applied on any unseen traffic networks.
arXiv Detail & Related papers (2021-09-11T09:04:35Z) - Learning dynamic and hierarchical traffic spatiotemporal features with
Transformer [4.506591024152763]
This paper proposes a novel model, Traffic Transformer, for spatial-temporal graph modeling and long-term traffic forecasting.
Transformer is the most popular framework in Natural Language Processing (NLP).
Analyzing the attention weight matrices can reveal the influential parts of road networks, allowing us to model traffic networks better.
arXiv Detail & Related papers (2021-04-12T02:29:58Z) - Spatial-Channel Transformer Network for Trajectory Prediction on the
Traffic Scenes [2.7955111755177695]
We present a Spatial-Channel Transformer Network for trajectory prediction with attention functions.
A channel-wise module is inserted to measure the social interaction between agents.
We find that the network achieves promising results on real-world trajectory prediction datasets on the traffic scenes.
arXiv Detail & Related papers (2021-01-27T15:03:42Z) - End-to-end Contextual Perception and Prediction with Interaction
Transformer [79.14001602890417]
We tackle the problem of detecting objects in 3D and forecasting their future motion in the context of self-driving.
To capture their spatial-temporal dependencies, we propose a recurrent neural network with a novel Transformer architecture.
Our model can be trained end-to-end, and runs in real-time.
arXiv Detail & Related papers (2020-08-13T14:30:12Z) - A Spatial-Temporal Attentive Network with Spatial Continuity for
Trajectory Prediction [74.00750936752418]
We propose a novel model named spatial-temporal attentive network with spatial continuity (STAN-SC).
First, a spatial-temporal attention mechanism is presented to explore the most useful and important information.
Second, we construct a joint feature sequence based on sequence and instant state information so that the generated trajectories maintain spatial continuity.
arXiv Detail & Related papers (2020-03-13T04:35:50Z) - Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)