Prediction of Time and Distance of Trips Using Explainable
Attention-based LSTMs
- URL: http://arxiv.org/abs/2303.15087v1
- Date: Mon, 27 Mar 2023 10:54:32 GMT
- Title: Prediction of Time and Distance of Trips Using Explainable
Attention-based LSTMs
- Authors: Ebrahim Balouji, Jonas Sjöblom, Nikolce Murgovski, Morteza Haghir
Chehreghani
- Abstract summary: We propose machine learning solutions to predict the time of future trips and the possible distance the vehicle will travel.
We use long short-term memory (LSTM)-based structures specifically designed to handle multi-dimensional historical data of trip time and distances simultaneously.
Among the proposed methods, the most advanced one, the parallel At-LSTM, predicts the next trip's distance and time with a 3.99% error margin.
- Score: 6.07913759162059
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose machine learning solutions to predict the time of
future trips and the possible distance the vehicle will travel. For this
prediction task, we develop and investigate four methods. In the first method,
we use long short-term memory (LSTM)-based structures specifically designed to
handle multi-dimensional historical data of trip time and distances
simultaneously. Using it, we predict the future trip time and forecast the
distance a vehicle will travel by concatenating the outputs of LSTM networks
through fully connected layers. The second method uses attention-based LSTM
networks (At-LSTM) to perform the same tasks. The third method utilizes two
LSTM networks in parallel, one for forecasting the time of the trip and the
other for predicting the distance. The output of each LSTM is then concatenated
through fully connected layers. Finally, the last model is based on two
parallel At-LSTMs, where similarly, each At-LSTM predicts time and distance
separately through fully connected layers. Among the proposed methods, the most
advanced one, i.e., the parallel At-LSTM, predicts the next trip's distance and
time with a 3.99% error margin, which is 23.89% better than the LSTM of the
first method. We also propose TimeSHAP as an explainability method for
understanding how the networks learn and model the sequence of information.
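The parallel At-LSTM described above runs two recurrent branches, one for trip time and one for distance, and concatenates their outputs through fully connected layers. The attention step, which weights per-step hidden states into a single context vector per branch, can be sketched roughly as follows (a minimal NumPy sketch; the hidden states are random placeholders, and `attention_pool`, the shapes, and the concatenation are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def attention_pool(h, w):
    """Soft attention over a sequence of hidden states.

    h: (T, d) array of per-step LSTM hidden states
    w: (d,) attention query vector (learned in a real model)
    Returns the (d,) context vector and the (T,) attention weights.
    """
    scores = h @ w                      # (T,) alignment scores
    scores = scores - scores.max()      # shift for numerical stability
    alpha = np.exp(scores)
    alpha = alpha / alpha.sum()         # softmax: weights sum to 1
    return alpha @ h, alpha

rng = np.random.default_rng(0)
d = 8                                   # hidden size (illustrative)
h_time = rng.standard_normal((5, d))    # placeholder states, "time" branch
h_dist = rng.standard_normal((5, d))    # placeholder states, "distance" branch
w_time = rng.standard_normal(d)
w_dist = rng.standard_normal(d)

# Each parallel branch attends over its own sequence; the two context
# vectors are concatenated before the fully connected output layers.
ctx_time, alpha_time = attention_pool(h_time, w_time)
ctx_dist, alpha_dist = attention_pool(h_dist, w_dist)
joint = np.concatenate([ctx_time, ctx_dist])   # (2*d,) input to FC head
```

In a real implementation, `h_time` and `h_dist` would come from trained LSTM layers, and the query vectors and fully connected head would be learned jointly by backpropagation.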
Related papers
- Beam Prediction based on Large Language Models [51.45077318268427]
Millimeter-wave (mmWave) communication is promising for next-generation wireless networks but suffers from significant path loss.
Traditional deep learning models, such as long short-term memory (LSTM), improve beam tracking accuracy but are limited by poor robustness and generalization.
In this letter, we use large language models (LLMs) to improve the robustness of beam prediction.
arXiv Detail & Related papers (2024-08-16T12:40:01Z) - Attention-LSTM for Multivariate Traffic State Prediction on Rural Roads [5.259027520298188]
An attention-based Long Short-Term Memory model (A-LSTM) is proposed to simultaneously predict traffic volume and speed.
This study compares the results of the A-LSTM model with the Long Short-Term Memory (LSTM) model.
arXiv Detail & Related papers (2023-01-06T22:23:57Z) - Image Classification using Sequence of Pixels [3.04585143845864]
This study compares sequential image classification methods based on recurrent neural networks.
We describe methods based on long short-term memory (LSTM) and bidirectional long short-term memory (BiLSTM) architectures, among others.
arXiv Detail & Related papers (2022-09-23T09:42:44Z) - Time-to-Green predictions for fully-actuated signal control systems with
supervised learning [56.66331540599836]
This paper proposes a time series prediction framework using aggregated traffic signal and loop detector data.
We utilize state-of-the-art machine learning models to predict future signal phases' duration.
Results based on an empirical data set from a fully-actuated signal control system in Zurich, Switzerland, show that machine learning models outperform conventional prediction methods.
arXiv Detail & Related papers (2022-08-24T07:50:43Z) - LSTMSPLIT: Effective SPLIT Learning based LSTM on Sequential Time-Series
Data [3.9011223632827385]
We propose a new approach, LSTMSPLIT, that uses a split learning (SL) architecture with an LSTM network to classify time-series data with multiple clients.
The proposed method, LSTMSPLIT, has achieved better or reasonable accuracy compared to the Split-1DCNN method using the electrocardiogram dataset and the human activity recognition dataset.
arXiv Detail & Related papers (2022-03-08T11:44:12Z) - End-to-end LSTM based estimation of volcano event epicenter localization [55.60116686945561]
An end-to-end based LSTM scheme is proposed to address the problem of volcano event localization.
LSTM was chosen due to its capability to capture the dynamics of time varying signals.
Results show that the LSTM-based architecture achieved a success rate (i.e., an error smaller than 1.0 km) of 48.5%.
arXiv Detail & Related papers (2021-10-27T17:11:33Z) - Rainfall-Runoff Prediction at Multiple Timescales with a Single Long
Short-Term Memory Network [41.33870234564485]
Long Short-Term Memory Networks (LSTMs) have been applied to daily discharge prediction with remarkable success.
Many practical scenarios, however, require predictions at more granular timescales.
In this study, we propose two Multi-Timescale LSTM (MTS-LSTM) architectures that jointly predict multiple timescales within one model.
We test these models on 516 basins across the continental United States and benchmark against the US National Water Model.
arXiv Detail & Related papers (2020-10-15T17:52:16Z) - SMART: Simultaneous Multi-Agent Recurrent Trajectory Prediction [72.37440317774556]
We propose advances that address two key challenges in future trajectory prediction: multimodality in both training data and predictions, and constant-time inference regardless of the number of agents.
arXiv Detail & Related papers (2020-07-26T08:17:10Z) - Object Tracking through Residual and Dense LSTMs [67.98948222599849]
Deep learning-based trackers based on LSTMs (Long Short-Term Memory) recurrent neural networks have emerged as a powerful alternative.
DenseLSTMs outperform Residual and regular LSTM, and offer a higher resilience to nuisances.
Our case study supports the adoption of residual-based RNNs for enhancing the robustness of other trackers.
arXiv Detail & Related papers (2020-06-22T08:20:17Z) - Stacked Bidirectional and Unidirectional LSTM Recurrent Neural Network
for Forecasting Network-wide Traffic State with Missing Values [23.504633202965376]
We focus on RNN-based models and attempt to reformulate the way to incorporate RNN and its variants into traffic prediction models.
A stacked bidirectional and unidirectional LSTM network architecture (SBU-LSTM) is proposed to assist the design of neural network structures for traffic state forecasting.
We also propose a data imputation mechanism in the LSTM structure (LSTM-I) by designing an imputation unit to infer missing values and assist traffic prediction.
arXiv Detail & Related papers (2020-05-24T00:17:15Z) - Depth-Adaptive Graph Recurrent Network for Text Classification [71.20237659479703]
Sentence-State LSTM (S-LSTM) is a powerful and high efficient graph recurrent network.
We propose a depth-adaptive mechanism for the S-LSTM, which allows the model to learn how many computational steps to conduct for different words as required.
arXiv Detail & Related papers (2020-02-29T03:09:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.