Deep State Space Recurrent Neural Networks for Time Series Forecasting
- URL: http://arxiv.org/abs/2407.15236v1
- Date: Sun, 21 Jul 2024 17:59:27 GMT
- Title: Deep State Space Recurrent Neural Networks for Time Series Forecasting
- Authors: Hugo Inzirillo
- Abstract summary: This paper introduces a novel neural network framework that blends the principles of econometric state space models with the dynamic capabilities of Recurrent Neural Networks (RNNs).
According to the results, TKANs, inspired by Kolmogorov-Arnold Networks (KANs) and LSTM, demonstrate promising outcomes.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We explore various neural network architectures for modeling the dynamics of the cryptocurrency market. Traditional linear models often fall short in accurately capturing the unique and complex dynamics of this market. In contrast, Deep Neural Networks (DNNs) have demonstrated considerable proficiency in time series forecasting. This paper introduces a novel neural network framework that blends the principles of econometric state space models with the dynamic capabilities of Recurrent Neural Networks (RNNs). We propose state space models using Long Short-Term Memory (LSTM), Gated Recurrent Units (GRUs) and Temporal Kolmogorov-Arnold Networks (TKANs). According to the results, TKANs, inspired by Kolmogorov-Arnold Networks (KANs) and LSTM, demonstrate promising outcomes.
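The abstract describes the recipe at a high level only; the sketch below illustrates the general pattern of a deep state space model, with an RNN (here an LSTM) serving as a nonlinear state transition equation and a linear head as the observation equation. All names and design choices are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of an RNN-parameterized state space forecaster (illustrative
# only; class and variable names are assumptions, not the paper's code).
import torch
import torch.nn as nn

class StateSpaceRNN(nn.Module):
    """Latent state space model whose transition is parameterized by an RNN."""

    def __init__(self, input_dim: int, latent_dim: int = 32):
        super().__init__()
        # State transition equation: h_t = f(h_{t-1}, x_t), realized by an LSTM.
        self.transition = nn.LSTM(input_dim, latent_dim, batch_first=True)
        # Observation equation: y_t = C h_t, realized by a linear layer.
        self.observation = nn.Linear(latent_dim, input_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features) -> one-step-ahead predictions, same shape.
        latent, _ = self.transition(x)
        return self.observation(latent)

# Toy usage: one-step-ahead forecasting on a random univariate series.
model = StateSpaceRNN(input_dim=1)
series = torch.randn(8, 64, 1)                    # (batch, time, features)
pred = model(series[:, :-1])                      # predict x_t from x_{<t}
loss = nn.functional.mse_loss(pred, series[:, 1:])
loss.backward()
```

Swapping the `nn.LSTM` for a GRU, or for a TKAN-style recurrent cell (see the sketch under the TKAN entry below), would yield the other variants the paper compares.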
Related papers
- Rethinking Spiking Neural Networks as State Space Models [1.9775291915550175]
Spiking neural networks (SNNs) are posited as a biologically plausible alternative to conventional neural architectures.
We present a novel class of spiking neuron models grounded in state space models.
Our models attain state-of-the-art performance among SNN models across diverse long-range dependency tasks.
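As a toy illustration of the general idea only (not the model proposed in the paper): a spiking neuron can be read as a discretized linear state space model, with a leaky membrane state following a linear update and a threshold/reset nonlinearity producing spikes.

```python
# Toy spiking neuron as a discretized linear state space model (illustrative
# assumption, not the paper's model).
import numpy as np

def spiking_ssm(inputs, a=0.9, b=1.0, threshold=1.0):
    """Membrane state v[t] = a*v[t-1] + b*u[t]; spike and reset at threshold."""
    v, spikes = 0.0, []
    for u in inputs:
        v = a * v + b * u                 # linear state transition
        s = 1.0 if v >= threshold else 0.0
        spikes.append(s)
        v *= 1.0 - s                      # reset the membrane after a spike
    return np.array(spikes)

print(spiking_ssm(np.full(20, 0.3)))      # regular spiking under constant drive
```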
arXiv Detail & Related papers (2024-06-05T04:23:11Z)
- TKAN: Temporal Kolmogorov-Arnold Networks [0.0]
Long Short-Term Memory (LSTM) has demonstrated its ability to capture long-term dependencies in sequential data.
Kolmogorov-Arnold Networks (KANs) are a promising alternative to Multi-Layer Perceptrons (MLPs).
Inspired by KANs and the LSTM, we propose a new neural network architecture: the Temporal Kolmogorov-Arnold Networks (TKANs).
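The defining KAN ingredient is that every edge of the network applies its own learnable univariate function. The sketch below uses a small Fourier basis in place of the splines KANs typically use, and wraps the layer in a simple gated recurrent update; the actual TKAN cell defined in the paper manages memory with LSTM-style gates, so treat everything here as an illustrative assumption.

```python
# Rough sketch of a KAN-style layer inside a recurrent cell (illustrative only).
import torch
import torch.nn as nn

class SimpleKANLayer(nn.Module):
    """Each (input, output) edge applies a learnable univariate function,
    parameterized here by a small Fourier basis instead of splines."""

    def __init__(self, in_dim: int, out_dim: int, n_basis: int = 5):
        super().__init__()
        self.coef = nn.Parameter(torch.randn(out_dim, in_dim, n_basis) * 0.1)
        self.freqs = torch.arange(1, n_basis + 1).float()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        phi = torch.sin(x.unsqueeze(-1) * self.freqs)   # (batch, in, n_basis)
        return torch.einsum("bip,oip->bo", phi, self.coef)

class TKANLikeCell(nn.Module):
    """Recurrent cell where a KAN layer replaces the usual dense update."""

    def __init__(self, dim: int):
        super().__init__()
        self.kan = SimpleKANLayer(2 * dim, dim)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, x, h):
        z = torch.cat([x, h], dim=-1)
        g = torch.sigmoid(self.gate(z))                 # forget/update gate
        return g * h + (1 - g) * torch.tanh(self.kan(z))

cell, h = TKANLikeCell(8), torch.zeros(4, 8)
for _ in range(16):                                     # unroll a toy sequence
    h = cell(torch.randn(4, 8), h)
```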
arXiv Detail & Related papers (2024-05-12T17:40:48Z)
- VMRNN: Integrating Vision Mamba and LSTM for Efficient and Accurate Spatiotemporal Forecasting [11.058879849373572]
Combining ViTs or CNNs with RNNs for spatiotemporal forecasting has yielded unparalleled results in predicting temporal and spatial dynamics.
Recent Mamba-based architectures have been met with enthusiasm for their exceptional long-sequence modeling capabilities.
We propose the VMRNN cell, a recurrent unit that integrates the strengths of Vision Mamba blocks with LSTM.
arXiv Detail & Related papers (2024-03-25T08:26:42Z)
- Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices with super-high speeds.
We propose a novel approach, the Fully Spiking Denoising Diffusion Implicit Model (FSDDIM), to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates and noisy, irregular observations.
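Hamiltonian neural networks in general learn a scalar energy H(q, p) and recover the dynamics from its gradients through Hamilton's equations; the paper's contribution is the trajectory integration scheme layered on top. Below is a minimal generic HNN with a plain symplectic Euler step standing in for that scheme (the integrator choice is an assumption, not the paper's method).

```python
# Generic Hamiltonian neural network sketch (illustrative; the integrator is
# a stand-in, not the integration scheme proposed in the paper).
import torch
import torch.nn as nn

class HNN(nn.Module):
    def __init__(self, dim: int = 1, hidden: int = 64):
        super().__init__()
        # Scalar energy H(q, p) parameterized by a small MLP.
        self.H = nn.Sequential(nn.Linear(2 * dim, hidden), nn.Tanh(),
                               nn.Linear(hidden, 1))

    def time_derivatives(self, q, p):
        q = q.detach().requires_grad_(True)
        p = p.detach().requires_grad_(True)
        H = self.H(torch.cat([q, p], dim=-1)).sum()
        dHdq, dHdp = torch.autograd.grad(H, (q, p), create_graph=True)
        return dHdp, -dHdq   # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq

def symplectic_euler_step(model, q, p, dt=0.01):
    _, dp = model.time_derivatives(q, p)
    p = p + dt * dp                        # update momentum first...
    dq, _ = model.time_derivatives(q, p)
    return q + dt * dq, p                  # ...then position from the new momentum

model = HNN()
q, p = torch.randn(32, 1), torch.randn(32, 1)
q, p = symplectic_euler_step(model, q, p)  # one step of the learned dynamics
```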
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNNs) have recently emerged as a new direction in graph learning.
We present the Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- Online learning of windmill time series using Long Short-term Cognitive Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach achieved the lowest forecasting errors compared with a simple RNN, a Long Short-term Memory network, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z)
- Spatio-Temporal Neural Network for Fitting and Forecasting COVID-19 [1.1129587851149594]
We establish a Spatio-Temporal Neural Network (STNN) to forecast the worldwide spread of the COVID-19 outbreak in 2020.
Two improved STNN architectures, namely the STNN with Augmented Spatial States (STNN-A) and the STNN with Input Gate (STNN-I), are proposed.
Numerical simulations demonstrate that STNN models outperform many others by providing more accurate fitting and prediction, and by handling both spatial and temporal data.
arXiv Detail & Related papers (2021-03-22T13:59:14Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
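The standard intuition behind ANN-to-SNN conversion, on which tandem learning builds, is that under rate coding the firing rate of an integrate-and-fire neuron approximates a ReLU activation. A toy demonstration of that correspondence (not the paper's conversion pipeline):

```python
# Rate-coding intuition behind ANN-to-SNN conversion: an integrate-and-fire
# neuron's firing rate approximates ReLU (toy demonstration only).
import numpy as np

def if_neuron_rate(drive: float, steps: int = 1000, threshold: float = 1.0) -> float:
    """Simulate an integrate-and-fire neuron; return its mean firing rate."""
    v, n_spikes = 0.0, 0
    for _ in range(steps):
        v += drive                    # integrate a constant input current
        if v >= threshold:
            v -= threshold            # soft reset keeps the residual charge
            n_spikes += 1
    return n_spikes / steps

for x in (-0.5, 0.0, 0.3, 0.7):
    print(f"relu({x:+.1f}) = {max(x, 0.0):.2f}  ~  rate = {if_neuron_rate(x):.2f}")
```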
arXiv Detail & Related papers (2020-07-02T15:38:44Z)