TrAISformer -- A Transformer Network with Sparse Augmented Data
Representation and Cross Entropy Loss for AIS-based Vessel Trajectory
Prediction
- URL: http://arxiv.org/abs/2109.03958v4
- Date: Wed, 3 Jan 2024 14:22:51 GMT
- Title: TrAISformer -- A Transformer Network with Sparse Augmented Data
Representation and Cross Entropy Loss for AIS-based Vessel Trajectory
Prediction
- Authors: Duong Nguyen and Ronan Fablet
- Abstract summary: Vessel trajectory prediction plays a pivotal role in numerous maritime applications and services.
Forecasting vessel trajectories using AIS data remains challenging, even for modern machine learning techniques.
We introduce a discrete, high-dimensional representation of AIS data and a new loss function designed to explicitly address heterogeneity and multimodality.
We report experimental results on real, publicly available AIS data. TrAISformer significantly outperforms state-of-the-art methods, with an average prediction error below 10 nautical miles for horizons up to 10 hours.
- Score: 9.281166430457647
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Vessel trajectory prediction plays a pivotal role in numerous maritime
applications and services. While the Automatic Identification System (AIS)
offers a rich source of information to address this task, forecasting vessel
trajectory using AIS data remains challenging, even for modern machine learning
techniques, because of the inherent heterogeneous and multimodal nature of
motion data. In this paper, we propose a novel approach to tackle these
challenges. We introduce a discrete, high-dimensional representation of AIS
data and a new loss function designed to explicitly address heterogeneity and
multimodality. The proposed model, referred to as TrAISformer, is a modified
transformer network that extracts long-term temporal patterns in AIS vessel
trajectories in the proposed enriched space to forecast the positions of
vessels several hours ahead. We report experimental results on real, publicly
available AIS data. TrAISformer significantly outperforms state-of-the-art
methods, with an average prediction error below 10 nautical miles for horizons up to
~10 hours.
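The core idea, discretizing each AIS attribute and training the transformer with a classification-style cross-entropy loss, can be illustrated with a short sketch. The attribute set (latitude, longitude, speed over ground, course over ground), bin counts, embedding sizes, and network depth below are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a TrAISformer-style model (illustrative only).
# The four attributes (lat, lon, SOG, COG), bin counts, and layer sizes are
# assumptions for demonstration, not the paper's exact configuration.
import torch
import torch.nn as nn

BINS = (250, 270, 30, 72)   # assumed number of bins for lat, lon, SOG, COG
D_EMB = 128                 # assumed embedding size per attribute

class TrAISformerSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Each continuous attribute is quantized to a bin index and embedded;
        # the concatenation forms the enriched, high-dimensional representation.
        self.embeds = nn.ModuleList([nn.Embedding(n, D_EMB) for n in BINS])
        layer = nn.TransformerEncoderLayer(d_model=len(BINS) * D_EMB, nhead=8,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        # One classification head per attribute: forecasting becomes predicting
        # the next bin indices, trained with cross-entropy.
        self.heads = nn.ModuleList([nn.Linear(len(BINS) * D_EMB, n) for n in BINS])

    def forward(self, bins):
        # bins: (batch, time, 4) integer bin indices of past AIS messages
        x = torch.cat([emb(bins[..., i]) for i, emb in enumerate(self.embeds)], dim=-1)
        t = bins.size(1)
        causal = torch.triu(torch.full((t, t), float("-inf"), device=x.device), diagonal=1)
        h = self.encoder(x, mask=causal)          # each step attends to the past only
        return [head(h) for head in self.heads]   # per-attribute logits

def step_loss(logits, next_bins):
    # Sum of cross-entropy terms over the four discretized attributes.
    ce = nn.CrossEntropyLoss()
    return sum(ce(lg.flatten(0, 1), next_bins[..., i].flatten())
               for i, lg in enumerate(logits))
```

Treating position forecasting as classification over bins lets the predicted distribution place probability mass on several plausible future routes instead of regressing to their average, which is how a cross-entropy loss can address multimodality.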
Related papers
- DST-TransitNet: A Dynamic Spatio-Temporal Deep Learning Model for Scalable and Efficient Network-Wide Prediction of Station-Level Transit Ridership [12.6020349733674]
This paper introduces DST-TransitNet, a hybrid Deep Learning model for system-wide ridership prediction.
It is tested on Bogota's BRT system data, with three distinct social scenarios.
It outperforms state-of-the-art models in precision, efficiency and robustness.
arXiv Detail & Related papers (2024-10-19T06:59:39Z) - Physics-guided Active Sample Reweighting for Urban Flow Prediction [75.24539704456791]
Urban flow prediction is a spatio-temporal modeling task that estimates the throughput of transportation services such as buses, taxis, and ride-sharing vehicles.
Some recent prediction solutions bring remedies with the notion of physics-guided machine learning (PGML).
We develop a discretized physics-guided network (PN) and propose a data-aware framework, Physics-guided Active Sample Reweighting (P-GASR).
arXiv Detail & Related papers (2024-07-18T15:44:23Z) - A Multi-Channel Spatial-Temporal Transformer Model for Traffic Flow Forecasting [0.0]
We propose a multi-channel spatial-temporal transformer model for traffic flow forecasting.
It improves the accuracy of the prediction by fusing results from different channels of traffic data.
Experimental results on six real-world datasets demonstrate that introducing a multi-channel mechanism into the temporal model enhances performance.
arXiv Detail & Related papers (2024-05-10T06:37:07Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and
Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity thanks to the self-attention mechanism, despite its high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - ST-former for short-term passenger flow prediction during COVID-19 in
urban rail transit system [8.506559986635057]
How to dynamically model the complex dependencies of passenger flow is the main issue in achieving accurate passenger flow prediction during the epidemic.
This paper proposes a transformer-based architecture called ST-former under the encoder-decoder framework, designed specifically for the COVID-19 period.
Experiments on real-world passenger flow datasets demonstrate the superiority of ST-former over the other eleven state-of-the-art methods.
arXiv Detail & Related papers (2022-10-14T01:51:33Z) - A Data Driven Method for Multi-step Prediction of Ship Roll Motion in
High Sea States [15.840386459188169]
This paper presents a novel data-driven methodology for achieving the multi-step prediction of ship roll motion in high sea states.
A hybrid neural network, named ConvLSPTMNet, is proposed, combining long short-term memory (LSTM) and one-dimensional convolutional neural network (CNN) components.
The results demonstrate that ConvLSPTMNet is more accurate than LSTM-only and CNN-only methods in multi-step prediction of roll motion.
arXiv Detail & Related papers (2022-07-26T06:26:00Z) - CTIN: Robust Contextual Transformer Network for Inertial Navigation [20.86392550313961]
We propose a robust Contextual Transformer-based network for Inertial Navigation (CTIN) to accurately predict velocity and trajectory.
CTIN is very robust and outperforms state-of-the-art models.
arXiv Detail & Related papers (2021-12-03T19:57:34Z) - DAE : Discriminatory Auto-Encoder for multivariate time-series anomaly
detection in air transportation [68.8204255655161]
We propose a novel anomaly detection model called Discriminatory Auto-Encoder (DAE)
It uses a regular LSTM-based auto-encoder as its baseline, but with several decoders, each receiving data from a specific flight phase (see the sketch after this list).
Results show that the DAE achieves better results in both accuracy and speed of detection.
arXiv Detail & Related papers (2021-09-08T14:07:55Z) - Deep Cellular Recurrent Network for Efficient Analysis of Time-Series
Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z) - SMART: Simultaneous Multi-Agent Recurrent Trajectory Prediction [72.37440317774556]
We propose advances that address two key challenges in future trajectory prediction:
multimodality in both training data and predictions, and constant-time inference regardless of the number of agents.
arXiv Detail & Related papers (2020-07-26T08:17:10Z) - FMA-ETA: Estimating Travel Time Entirely Based on FFN With Attention [88.33372574562824]
We propose a novel framework for ETA based on a feed-forward network (FFN) with multi-factor self-attention (FMA-ETA).
The novel multi-factor self-attention mechanism is proposed to handle different categories of features and aggregate the information purposefully.
Experiments show FMA-ETA is competitive with state-of-the-art methods in terms of the prediction accuracy with significantly better inference speed.
arXiv Detail & Related papers (2020-06-07T08:10:47Z)
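Following the forward reference in the DAE entry above, here is a minimal sketch of an LSTM auto-encoder with one decoder per flight phase. The feature count, hidden size, number of phases, and the reconstruction-error anomaly score are assumptions for illustration, not that paper's actual configuration.

```python
# Illustrative multi-decoder LSTM auto-encoder in the spirit of the DAE entry
# above: a shared encoder with one decoder per flight phase. Sizes, phase count,
# and the reconstruction-based score are assumptions, not the paper's setup.
import torch
import torch.nn as nn

class MultiDecoderAE(nn.Module):
    def __init__(self, n_features=8, hidden=64, n_phases=3):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        # One decoder per flight phase; each only sees samples of its phase.
        self.decoders = nn.ModuleList(
            [nn.LSTM(hidden, hidden, batch_first=True) for _ in range(n_phases)]
        )
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x, phase):
        # x: (batch, time, n_features); phase: index of the flight phase.
        z, _ = self.encoder(x)
        h, _ = self.decoders[phase](z)
        return self.out(h)                      # reconstructed sequence

def anomaly_score(model, x, phase):
    # Larger reconstruction error marks a segment as more anomalous.
    with torch.no_grad():
        recon = model(x, phase)
    return torch.mean((recon - x) ** 2, dim=(1, 2))
```

Routing each flight phase to its own decoder is what makes such an auto-encoder discriminatory: reconstruction quality reflects how typical a segment is for that specific phase.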
This list is automatically generated from the titles and abstracts of the papers in this site.