Time Series Forecasting with Stacked Long Short-Term Memory Networks
- URL: http://arxiv.org/abs/2011.00697v1
- Date: Mon, 2 Nov 2020 03:09:23 GMT
- Title: Time Series Forecasting with Stacked Long Short-Term Memory Networks
- Authors: Frank Xiao
- Abstract summary: This paper explores the effectiveness of applying stacked LSTM networks to time series prediction, specifically traffic volume forecasting.
Predicting traffic volume more accurately enables better planning, greatly reducing operating costs and improving overall efficiency.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Long Short-Term Memory (LSTM) networks are often used to capture temporal
dependency patterns. By stacking multiple LSTM layers, a model can capture even
more complex patterns. This paper explores the effectiveness of applying
stacked LSTM networks to time series prediction, specifically traffic volume
forecasting. Predicting traffic volume more accurately enables better planning,
greatly reducing operating costs and improving overall efficiency.
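The stacking idea in the abstract can be illustrated with a minimal NumPy forward pass. This is a generic two-layer LSTM sketch, not the paper's actual architecture: the layer sizes, initialization, and toy input signal below are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell (forward pass only), for illustration."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden_size)
        # One weight matrix holding the four gates: input, forget, cell, output.
        self.W = rng.uniform(-scale, scale, (4 * hidden_size, input_size + hidden_size))
        self.b = np.zeros(4 * hidden_size)
        self.hidden_size = hidden_size

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_size
        i, f = sigmoid(z[:H]), sigmoid(z[H:2*H])      # input and forget gates
        g, o = np.tanh(z[2*H:3*H]), sigmoid(z[3*H:])  # candidate cell and output gate
        c_new = f * c + i * g
        return o * np.tanh(c_new), c_new

def stacked_lstm(series, cells):
    """Run a stack of LSTM cells over a univariate series; each layer
    consumes the hidden states emitted by the layer below it."""
    states = [(np.zeros(c.hidden_size), np.zeros(c.hidden_size)) for c in cells]
    for x_t in series:
        inp = np.array([x_t])
        for k, cell in enumerate(cells):
            h, c = cell.step(inp, *states[k])
            states[k] = (h, c)
            inp = h  # hidden state of layer k is the input to layer k + 1
    return states[-1][0]  # final hidden state of the top layer

# Two stacked layers over a toy "traffic volume" signal.
cells = [LSTMCell(1, 8, seed=0), LSTMCell(8, 8, seed=1)]
h_top = stacked_lstm(np.sin(np.linspace(0.0, 6.0, 24)), cells)
print(h_top.shape)  # (8,)
```

A trained forecaster would add a linear readout on the top hidden state and fit all weights by backpropagation; frameworks provide this directly, e.g. PyTorch's `nn.LSTM(..., num_layers=2)`.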
Related papers
- Improving Traffic Flow Predictions with SGCN-LSTM: A Hybrid Model for Spatial and Temporal Dependencies [55.2480439325792]
This paper introduces the Signal-Enhanced Graph Convolutional Network Long Short Term Memory (SGCN-LSTM) model for predicting traffic speeds across road networks.
Experiments on the PEMS-BAY road network traffic dataset demonstrate the SGCN-LSTM model's effectiveness.
arXiv Detail & Related papers (2024-11-01T00:37:00Z)
- Unlocking the Power of LSTM for Long Term Time Series Forecasting [27.245021350821638]
We propose a simple yet efficient algorithm named P-sLSTM built upon sLSTM by incorporating patching and channel independence.
These modifications substantially enhance sLSTM's performance in TSF, achieving state-of-the-art results.
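Patching and channel independence can be sketched as simple array transforms. The patch length and stride below are arbitrary, and this is a PatchTST-style reading of the idea rather than the paper's exact implementation.

```python
import numpy as np

def patch_series(x, patch_len, stride):
    """Split a 1-D series into (possibly overlapping) fixed-length patches,
    each of which is then treated as one input token to the model."""
    n = (len(x) - patch_len) // stride + 1
    return np.stack([x[i * stride : i * stride + patch_len] for i in range(n)])

def channel_independent_patches(mv, patch_len, stride):
    """Channel independence: patch each variable of a multivariate series
    separately, instead of mixing channels in a single embedding."""
    return [patch_series(mv[:, c], patch_len, stride) for c in range(mv.shape[1])]

patches = patch_series(np.arange(16.0), patch_len=4, stride=2)
print(patches.shape)  # (7, 4): 7 patches of length 4
```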
arXiv Detail & Related papers (2024-08-19T13:59:26Z)
- Beam Prediction based on Large Language Models [51.45077318268427]
Millimeter-wave (mmWave) communication is promising for next-generation wireless networks but suffers from significant path loss.
Traditional deep learning models, such as long short-term memory (LSTM), improve beam tracking accuracy but are limited by poor robustness and generalization.
In this letter, we use large language models (LLMs) to improve the robustness of beam prediction.
arXiv Detail & Related papers (2024-08-16T12:40:01Z)
- TCGPN: Temporal-Correlation Graph Pre-trained Network for Stock Forecasting [1.864621482724548]
We propose a novel approach called the Temporal-Correlation Graph Pre-trained Network (TCGPN) to address these limitations.
TCGPN uses a temporal-correlation fusion encoder to obtain a mixed representation, together with a pre-training method built on carefully designed temporal and correlation pre-training tasks.
Experiments are conducted on real stock market data sets CSI300 and CSI500 that exhibit minimal periodicity.
arXiv Detail & Related papers (2024-07-26T05:27:26Z)
- Short-Term Multi-Horizon Line Loss Rate Forecasting of a Distribution Network Using Attention-GCN-LSTM [9.460123100630158]
We propose Attention-GCN-LSTM, a novel method that combines Graph Convolutional Networks (GCN), Long Short-Term Memory (LSTM) and a three-level attention mechanism.
Our model enables accurate forecasting of line loss rates across multiple horizons.
arXiv Detail & Related papers (2023-12-19T06:47:22Z)
- Recurrent Neural Networks with more flexible memory: better predictions than rough volatility [0.0]
We compare the ability of vanilla and extended long short term memory networks to predict asset price volatility.
We show that the model with the smallest validation loss systematically outperforms rough volatility predictions by about 20% when trained and tested on a dataset with multiple time series.
arXiv Detail & Related papers (2023-08-04T14:24:57Z) - Learning Fast and Slow for Online Time Series Forecasting [76.50127663309604]
Fast and Slow learning Networks (FSNet) is a holistic framework for online time-series forecasting.
FSNet balances fast adaptation to recent changes with the retrieval of similar old knowledge.
Our code will be made publicly available.
arXiv Detail & Related papers (2022-02-23T18:23:07Z) - Online learning of windmill time series using Long Short-term Cognitive
Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach achieved the lowest forecasting errors compared with a simple RNN, a Long Short-term Memory network, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z) - Slower is Better: Revisiting the Forgetting Mechanism in LSTM for Slower
Information Decay [4.414729427965163]
We propose a power law forget gate, which learns to forget information along a slower power law decay function.
We show that LSTM with the proposed forget gate can learn long-term dependencies, outperforming other recurrent networks in multiple domains.
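The contrast with standard forgetting can be made concrete: a constant forget gate f decays stored cell content as f**t (exponential), while a power-law forget gate targets a (1 + t)**(-p) schedule. The paper's exact gate parameterization may differ; the decay curves below are only an illustration of the retention behavior.

```python
import numpy as np

def exponential_retention(f, t):
    """Standard LSTM with a constant forget gate f: content decays as f**t."""
    return f ** t

def power_law_retention(p, t):
    """Power-law forgetting: content after t steps decays as (1 + t)**(-p)."""
    return (1.0 + t) ** (-p)

t = np.arange(200)
exp_r = exponential_retention(0.9, t)
pow_r = power_law_retention(0.5, t)
# At long horizons, the power-law curve retains far more information than
# the exponential one, which is why it helps with long-term dependencies.
print(exp_r[100], pow_r[100])
```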
arXiv Detail & Related papers (2021-05-12T20:21:16Z) - Object Tracking through Residual and Dense LSTMs [67.98948222599849]
Deep trackers built on LSTM (Long Short-Term Memory) recurrent neural networks have emerged as a powerful alternative.
Dense LSTMs outperform residual and regular LSTMs, and offer higher resilience to nuisances.
Our case study supports the adoption of residual-based RNNs for enhancing the robustness of other trackers.
arXiv Detail & Related papers (2020-06-22T08:20:17Z) - Deep Stock Predictions [58.720142291102135]
We consider the design of a trading strategy that performs portfolio optimization using Long Short Term Memory (LSTM) neural networks.
We then customize the loss function used to train the LSTM to increase the profit earned.
We find the LSTM model with the customized loss function to have an improved performance in the training bot over a regressive baseline such as ARIMA.
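The abstract does not spell out the customized loss. One hypothetical way to bias a squared-error loss toward profit is to penalize predictions whose direction disagrees with the realized price move; the function name, penalty factor, and formulation below are assumptions for illustration, not the paper's loss.

```python
import numpy as np

def direction_weighted_loss(y_true, y_pred, prev, penalty=2.0):
    """Hypothetical profit-oriented loss: squared error, up-weighted whenever
    the predicted price direction disagrees with the realized direction."""
    err = (y_true - y_pred) ** 2
    wrong_dir = np.sign(y_pred - prev) != np.sign(y_true - prev)
    return np.mean(np.where(wrong_dir, penalty * err, err))

prev   = np.array([1.0, 1.0])   # yesterday's prices
y_true = np.array([2.0, 0.0])   # realized prices
y_pred = np.array([1.5, 1.5])   # forecasts: the second gets the direction wrong
print(direction_weighted_loss(y_true, y_pred, prev))  # 2.375 vs plain MSE 1.25
```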
arXiv Detail & Related papers (2020-06-08T23:37:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.