Attention-LSTM for Multivariate Traffic State Prediction on Rural Roads
- URL: http://arxiv.org/abs/2301.02731v1
- Date: Fri, 6 Jan 2023 22:23:57 GMT
- Title: Attention-LSTM for Multivariate Traffic State Prediction on Rural Roads
- Authors: Elahe Sherafat and Bilal Farooq and Amir Hossein Karbasi and
Seyedehsan Seyedabrishami
- Abstract summary: An Attention-based Long Short-Term Memory model (A-LSTM) is proposed to simultaneously predict traffic volume and speed.
This study compares the results of the A-LSTM model with the Long Short-Term Memory (LSTM) model.
- Score: 5.259027520298188
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate traffic volume and speed prediction has a wide range of
applications in transportation and can provide useful and timely information
for both travellers and transportation decision-makers. In this study, an
Attention-based Long Short-Term Memory (A-LSTM) model is proposed to
simultaneously predict traffic volume and speed on a critical rural road
segment that connects Tehran to Chalus, the most popular tourist destination city
in Iran. Moreover, this study compares the results of the A-LSTM model with the
Long Short-Term Memory (LSTM) model. Both models show acceptable performance in
predicting speed and flow. However, the A-LSTM model outperforms the LSTM at the
5- and 15-minute intervals. In contrast, there is no meaningful difference between
the two models for the 30-minute time interval. By comparing the performance of
the models based on different time horizons, the 15-minute horizon model
outperforms the others by reaching the lowest Mean Square Error (MSE) loss of
0.0032, followed by the 30- and 5-minute horizons with 0.004 and 0.0051,
respectively. In addition, this study compares the results of the models based
on two transformations of temporal categorical input variables, one-hot or
cyclic, for the 15-minute time interval. The results demonstrate that both LSTM
and A-LSTM with cyclic feature encoding outperform those with one-hot feature
encoding.
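As a concrete illustration of the approach described in the abstract, below is a minimal sketch, not the authors' code, of an attention-based LSTM that jointly outputs volume and speed, together with a sin/cos cyclic encoding of temporal categorical inputs (e.g. hour of day). The use of PyTorch, the single-score attention formulation, the layer sizes, and the feature layout are all illustrative assumptions.

```python
# Hypothetical sketch of an A-LSTM-style model for joint traffic volume and
# speed prediction. Architecture details are assumptions, not the paper's exact setup.
import numpy as np
import torch
import torch.nn as nn


def cyclic_encode(value, period):
    """Encode a temporal categorical value (e.g. hour of day, period=24;
    day of week, period=7) as sin/cos components instead of a one-hot vector."""
    angle = 2.0 * np.pi * np.asarray(value) / period
    return np.stack([np.sin(angle), np.cos(angle)], axis=-1)


class ALSTM(nn.Module):
    """LSTM encoder with attention over its hidden states, followed by a
    head that predicts volume and speed jointly (multivariate output)."""

    def __init__(self, n_features, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.attn_score = nn.Linear(hidden_size, 1)  # one score per time step
        self.head = nn.Linear(hidden_size, 2)        # outputs: [volume, speed]

    def forward(self, x):                    # x: (batch, time, n_features)
        h, _ = self.lstm(x)                  # h: (batch, time, hidden)
        weights = torch.softmax(self.attn_score(h), dim=1)  # attention weights over time
        context = (weights * h).sum(dim=1)   # attention-weighted summary of the sequence
        return self.head(context)            # (batch, 2)


# Example: 8 sequences of 12 past intervals with 6 features each, e.g. past
# volume, past speed, and cyclically encoded hour-of-day and day-of-week.
model = ALSTM(n_features=6)
y_hat = model(torch.randn(8, 12, 6))
loss = nn.MSELoss()(y_hat, torch.randn(8, 2))  # MSE loss, as reported in the abstract
```

Under this setup, the one-hot variant compared in the abstract would simply replace the two sin/cos columns of each temporal variable with a 24- or 7-dimensional indicator vector, leaving the rest of the model unchanged.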
Related papers
- Improving Traffic Flow Predictions with SGCN-LSTM: A Hybrid Model for Spatial and Temporal Dependencies [55.2480439325792]
This paper introduces the Signal-Enhanced Graph Convolutional Network Long Short Term Memory (SGCN-LSTM) model for predicting traffic speeds across road networks.
Experiments on the PEMS-BAY road network traffic dataset demonstrate the SGCN-LSTM model's effectiveness.
arXiv Detail & Related papers (2024-11-01T00:37:00Z) - Strada-LLM: Graph LLM for traffic prediction [62.2015839597764]
A considerable challenge in traffic prediction lies in handling the diverse data distributions caused by vastly different traffic conditions.
We propose a graph-aware LLM for traffic prediction that considers proximal traffic information.
We adopt a lightweight approach for efficient domain adaptation when facing new data distributions in a few-shot fashion.
arXiv Detail & Related papers (2024-10-28T09:19:29Z) - Prediction of Time and Distance of Trips Using Explainable
Attention-based LSTMs [6.07913759162059]
We propose machine learning solutions to predict the time of future trips and the possible distance the vehicle will travel.
We use long short-term memory (LSTM)-based structures specifically designed to handle multi-dimensional historical data of trip time and distances simultaneously.
Among the proposed methods, the most advanced one, parallel At-LSTM, predicts the next trip's distance and time with a 3.99% error margin.
arXiv Detail & Related papers (2023-03-27T10:54:32Z) - GC-GRU-N for Traffic Prediction using Loop Detector Data [5.735035463793008]
We use Seattle loop detector data aggregated over 15 minutes and reframe the problem across space and time.
The model ranked second, with the fastest inference time and performance very close to first place (Transformers).
arXiv Detail & Related papers (2022-11-13T06:32:28Z) - End-to-end LSTM based estimation of volcano event epicenter localization [55.60116686945561]
An end-to-end LSTM-based scheme is proposed to address the problem of volcano event localization.
LSTM was chosen due to its capability to capture the dynamics of time varying signals.
Results show that the LSTM-based architecture achieved a success rate (i.e., an error smaller than 1.0 km) of 48.5%.
arXiv Detail & Related papers (2021-10-27T17:11:33Z) - Rainfall-Runoff Prediction at Multiple Timescales with a Single Long
Short-Term Memory Network [41.33870234564485]
Long Short-Term Memory Networks (LSTMs) have been applied to daily discharge prediction with remarkable success.
Many practical scenarios, however, require predictions at more granular timescales.
In this study, we propose two Multi-Timescale LSTM (MTS-LSTM) architectures that jointly predict multiple timescales within one model.
We test these models on 516 basins across the continental United States and benchmark against the US National Water Model.
arXiv Detail & Related papers (2020-10-15T17:52:16Z) - Prediction of Traffic Flow via Connected Vehicles [77.11902188162458]
We propose a Short-term Traffic flow Prediction framework so that transportation authorities can take early action to control flow and prevent congestion.
We anticipate flow at future time frames on a target road segment based on historical flow data and innovative features such as real-time feeds and trajectory data provided by Connected Vehicles (CV) technology.
We show how this novel approach allows advanced modelling by integrating into the flow forecast the impact of various events that CVs realistically encounter on segments along their trajectories.
arXiv Detail & Related papers (2020-07-10T16:00:44Z) - Time Series Analysis and Forecasting of COVID-19 Cases Using LSTM and
ARIMA Models [4.56877715768796]
Coronavirus disease 2019 (COVID-19) is a global public health crisis that has been declared a pandemic by the World Health Organization.
This study explores the performance of several Long Short-Term Memory (LSTM) models and Auto-Regressive Integrated Moving Average (ARIMA) model in forecasting the number of confirmed COVID-19 cases.
arXiv Detail & Related papers (2020-06-05T20:07:48Z) - Distributed Fine-Grained Traffic Speed Prediction for Large-Scale
Transportation Networks based on Automatic LSTM Customization and Sharing [0.27528170226206433]
DistPre is a distributed fine-grained traffic speed prediction scheme for large-scale transportation networks.
DistPre provides time-efficient LSTM customization and accurate fine-grained traffic-speed prediction for large-scale transportation networks.
arXiv Detail & Related papers (2020-05-10T21:24:23Z) - A Generative Learning Approach for Spatio-temporal Modeling in Connected
Vehicular Network [55.852401381113786]
This paper proposes LaMI (Latency Model Inpainting), a novel framework to generate a comprehensive spatio-temporal quality framework for the wireless access latency of connected vehicles.
LaMI adopts the idea from image inpainting and synthesizing and can reconstruct the missing latency samples by a two-step procedure.
In particular, it first discovers the spatial correlation between samples collected in various regions using a patching-based approach and then feeds the original and highly correlated samples into a Variational Autoencoder (VAE).
arXiv Detail & Related papers (2020-03-16T03:43:59Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)