Applied Machine Learning Methods with Long-Short Term Memory Based Recurrent Neural Networks for Multivariate Temperature Prediction
- URL: http://arxiv.org/abs/2503.06278v1
- Date: Sat, 08 Mar 2025 16:52:27 GMT
- Title: Applied Machine Learning Methods with Long-Short Term Memory Based Recurrent Neural Networks for Multivariate Temperature Prediction
- Authors: Bojan Lukić
- Abstract summary: This paper gives an overview of how to develop a dense and deep neural network for making a time series prediction. Python's development environment Jupyter, extended with the Keras package, is used. The results and evaluation of the work show that a weather prediction with deep neural networks can be successful for a short time period.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: This paper gives an overview of how to develop a dense and deep neural network for making a time series prediction. First, the history and cornerstones of Artificial Intelligence and Machine Learning will be presented. After a short introduction to the theory of Artificial Intelligence and Machine Learning, the paper will go deeper into the techniques for conducting a time series prediction with different models of neural networks. For this project, Python's development environment Jupyter, extended with the TensorFlow package and the deep-learning library Keras, is used. The system setup and project framework are explained in more detail before discussing the time series prediction. The main part shows an applied example of time series prediction with weather data. For this work, a deep recurrent neural network with Long Short-Term Memory cells is used to conduct the time series prediction. The results and evaluation of the work show that a weather prediction with deep neural networks can be successful for a short time period. However, there are some drawbacks and limitations of time series prediction, which will be discussed towards the end of the paper.
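As a hedged illustration of the setup the abstract describes, the sketch below builds a small stacked-LSTM regressor in Keras on windowed multivariate weather data. The window length, feature count, and layer sizes are illustrative assumptions, not the author's exact configuration.

```python
# A minimal sketch (not the author's exact architecture) of a stacked LSTM
# regressor in Keras for multivariate weather windows.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_steps, n_features = 24, 5   # assumed: 24 hourly readings of 5 weather variables

model = keras.Sequential([
    layers.Input(shape=(n_steps, n_features)),
    layers.LSTM(64, return_sequences=True),  # first recurrent layer keeps the sequence
    layers.LSTM(32),                         # second layer summarizes it
    layers.Dense(1),                         # regression head: next temperature value
])
model.compile(optimizer="adam", loss="mse")

# Dummy arrays standing in for windowed weather measurements.
X = np.random.rand(256, n_steps, n_features).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```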
Related papers
- Enhancing Foundation Models for Time Series Forecasting via Wavelet-based Tokenization [74.3339999119713]
We develop a wavelet-based tokenizer that allows models to learn complex representations directly in the space of time-localized frequencies. Our method first scales and decomposes the input time series, then thresholds and quantizes the wavelet coefficients, and finally pre-trains an autoregressive model to forecast coefficients for the forecast horizon.
arXiv Detail & Related papers (2024-12-06T18:22:59Z)
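The summary above names a concrete pipeline (scale, decompose, threshold, quantize); a rough sketch of those steps with PyWavelets follows. The wavelet family, decomposition level, threshold, and quantization step are assumptions, and the autoregressive model itself is omitted.

```python
# Hedged sketch of the tokenization steps named in the summary.
import numpy as np
import pywt

x = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.1 * np.random.randn(512)

x = (x - x.mean()) / x.std()                 # scale the input series
coeffs = pywt.wavedec(x, "db4", level=3)     # multi-level wavelet decomposition
coeffs = [pywt.threshold(c, value=0.1, mode="soft") for c in coeffs]  # threshold
step = 0.05                                  # assumed quantization step size
tokens = [np.round(c / step).astype(int) for c in coeffs]             # quantize

# `tokens` would then feed an autoregressive model trained to forecast
# coefficients over the horizon (not shown).
```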
- Predicting Long-horizon Futures by Conditioning on Geometry and Time [49.86180975196375]
We explore the task of generating future sensor observations conditioned on the past.
We leverage the large-scale pretraining of image diffusion models, which can handle multi-modality.
We create a benchmark for video prediction on a diverse set of videos spanning indoor and outdoor scenes.
arXiv Detail & Related papers (2024-04-17T16:56:31Z)
- A Dynamical Model of Neural Scaling Laws [79.59705237659547]
We analyze a random feature model trained with gradient descent as a solvable model of network training and generalization.
Our theory shows how the gap between training and test loss can gradually build up over time due to repeated reuse of data.
arXiv Detail & Related papers (2024-02-02T01:41:38Z)
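A toy version of the solvable setup the entry above refers to: frozen random ReLU features with a linear readout trained by plain gradient descent on squared loss. All dimensions and the learning rate are illustrative assumptions.

```python
# Toy random feature model trained with gradient descent (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n, d, p = 200, 10, 500                     # samples, input dim, feature count
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d)             # simple linear teacher signal

W = rng.standard_normal((d, p)) / np.sqrt(d)   # frozen random projection
Phi = np.maximum(X @ W, 0.0)                   # ReLU random features

theta = np.zeros(p)                            # trainable linear readout
lr = 1.0 / p
for _ in range(2000):                          # plain gradient descent on MSE
    grad = Phi.T @ (Phi @ theta - y) / n
    theta -= lr * grad

print("train MSE:", float(np.mean((Phi @ theta - y) ** 2)))
```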
- Pyramidal Predictive Network: A Model for Visual-frame Prediction Based on Predictive Coding Theory [1.4610038284393165]
We propose a novel neural network model for the task of visual-frame prediction.
The model is composed of a series of recurrent and convolutional units forming the top-down and bottom-up streams.
It learns to predict future frames in a visual sequence, with ConvLSTMs on each layer of the network making local predictions from the top down.
arXiv Detail & Related papers (2022-08-15T06:28:34Z)
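This is not the paper's pyramidal architecture; it is only a minimal Keras sketch of the ConvLSTM building block the summary mentions, predicting one next frame from a short clip. Frame shapes and filter counts are assumptions.

```python
# Minimal ConvLSTM next-frame sketch (building block only, not the pyramid).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

frames, h, w, ch = 10, 32, 32, 1   # assumed clip length and frame size

model = keras.Sequential([
    layers.Input(shape=(frames, h, w, ch)),
    layers.ConvLSTM2D(16, kernel_size=3, padding="same"),   # spatiotemporal state
    layers.Conv2D(ch, kernel_size=3, padding="same",
                  activation="sigmoid"),                    # decode the next frame
])
model.compile(optimizer="adam", loss="binary_crossentropy")

clips = np.random.rand(8, frames, h, w, ch).astype("float32")
next_frames = np.random.rand(8, h, w, ch).astype("float32")
model.fit(clips, next_frames, epochs=1, verbose=0)
```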
- Multivariate Time Series Regression with Graph Neural Networks [0.6124773188525718]
Recent advances in adapting Deep Learning to graphs have shown promising potential in various graph-related tasks.
However, these methods have not been adapted for time series related tasks to a great extent.
In this work, we propose an architecture capable of processing these long sequences in a multivariate time series regression task.
arXiv Detail & Related papers (2022-01-03T16:11:46Z)
- Online learning of windmill time series using Long Short-term Cognitive Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach reported the lowest forecasting errors compared with a simple RNN, a Long Short-term Memory network, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z)
- Long Short-term Cognitive Networks [2.2748974006378933]
We present a recurrent neural system named Long Short-term Cognitive Networks (LSTCNs) as a generalisation of the Short-term Cognitive Network (STCN) model.
Our neural system reports small forecasting errors while being up to thousands of times faster than state-of-the-art recurrent models.
arXiv Detail & Related papers (2021-06-30T17:42:09Z)
- Evaluation of deep learning models for multi-step ahead time series prediction [1.3764085113103222]
We present an evaluation study that compares the performance of deep learning models for multi-step ahead time series prediction.
Our deep learning methods comprise simple recurrent neural networks, long short-term memory (LSTM) networks, bidirectional LSTMs, encoder-decoder LSTM networks, and convolutional neural networks.
arXiv Detail & Related papers (2021-03-26T04:07:11Z)
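One of the model families compared in the entry above, sketched in Keras: an encoder-decoder LSTM mapping an input window to a multi-step horizon. The window length, horizon, and layer sizes are assumptions.

```python
# Encoder-decoder LSTM sketch for multi-step ahead prediction (illustrative).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_steps_in, n_steps_out, n_features = 48, 12, 1   # assumed window and horizon

model = keras.Sequential([
    layers.Input(shape=(n_steps_in, n_features)),
    layers.LSTM(64),                          # encoder: compress the input window
    layers.RepeatVector(n_steps_out),         # repeat context for each output step
    layers.LSTM(64, return_sequences=True),   # decoder: unroll over the horizon
    layers.TimeDistributed(layers.Dense(1)),  # one prediction per future step
])
model.compile(optimizer="adam", loss="mse")

X = np.random.rand(128, n_steps_in, n_features).astype("float32")
y = np.random.rand(128, n_steps_out, 1).astype("float32")
model.fit(X, y, epochs=2, verbose=0)
```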
- SLPC: a VRNN-based approach for stochastic lidar prediction and completion in autonomous driving [63.87272273293804]
We propose a new LiDAR prediction framework based on generative models, namely Variational Recurrent Neural Networks (VRNNs).
Our algorithm is able to address the limitations of previous video prediction frameworks when dealing with sparse data by spatially inpainting the depth maps in the upcoming frames.
We present a sparse version of VRNNs and an effective self-supervised training method that does not require any labels.
arXiv Detail & Related papers (2021-02-19T11:56:44Z)
- Improved Predictive Deep Temporal Neural Networks with Trend Filtering [22.352437268596674]
We propose a new prediction framework based on deep neural networks and trend filtering.
We reveal that the predictive performance of deep temporal neural networks improves when the training data is temporally processed by trend filtering.
arXiv Detail & Related papers (2020-10-16T08:29:36Z)
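The summary does not specify which trend filter the paper uses; as a hedged stand-in, the sketch below smooths a training series with the Hodrick-Prescott filter from statsmodels before it would be handed to a temporal network.

```python
# Trend filtering as a preprocessing step; the HP filter is an assumed stand-in
# for whatever filter the paper actually applies.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

t = np.linspace(0, 10, 500)
series = np.sin(t) + 0.05 * t**2 + 0.2 * np.random.randn(500)

cycle, trend = hpfilter(series, lamb=1600)   # returns (cyclical part, smooth trend)

# Feed the smoothed trend (or the detrended residual) to the downstream
# network, depending on which component it should model.
smoothed_training_input = trend
```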
- ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training [85.35910219651572]
We present a new sequence-to-sequence pre-training model called ProphetNet.
It introduces a novel self-supervised objective named future n-gram prediction.
We conduct experiments on CNN/DailyMail, Gigaword, and SQuAD 1.1 benchmarks for abstractive summarization and question generation tasks.
arXiv Detail & Related papers (2020-01-13T05:12:38Z)
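For context, a minimal usage sketch of a pre-trained ProphetNet checkpoint via the Hugging Face transformers library; the checkpoint name, input text, and generation settings are assumptions, not taken from the entry above.

```python
# Hedged usage sketch: summarization with a pre-trained ProphetNet checkpoint.
from transformers import ProphetNetForConditionalGeneration, ProphetNetTokenizer

name = "microsoft/prophetnet-large-uncased"  # assumed public checkpoint
tokenizer = ProphetNetTokenizer.from_pretrained(name)
model = ProphetNetForConditionalGeneration.from_pretrained(name)

text = "ProphetNet introduces future n-gram prediction as a self-supervised objective."
inputs = tokenizer(text, return_tensors="pt")
summary_ids = model.generate(inputs["input_ids"], max_length=20, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```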