Deep State Space Recurrent Neural Networks for Time Series Forecasting
- URL: http://arxiv.org/abs/2407.15236v1
- Date: Sun, 21 Jul 2024 17:59:27 GMT
- Title: Deep State Space Recurrent Neural Networks for Time Series Forecasting
- Authors: Hugo Inzirillo
- Abstract summary: This paper introduces a novel neural network framework that blends the principles of econometric state space models with the dynamic capabilities of Recurrent Neural Networks (RNNs).
According to the results, TKANs, inspired by Kolmogorov-Arnold Networks (KANs) and LSTM, demonstrate promising outcomes.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We explore various neural network architectures for modeling the dynamics of the cryptocurrency market. Traditional linear models often fall short in accurately capturing the unique and complex dynamics of this market. In contrast, Deep Neural Networks (DNNs) have demonstrated considerable proficiency in time series forecasting. This paper introduces a novel neural network framework that blends the principles of econometric state space models with the dynamic capabilities of Recurrent Neural Networks (RNNs). We propose state space models using Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU) and Temporal Kolmogorov-Arnold Networks (TKANs). According to the results, TKANs, inspired by Kolmogorov-Arnold Networks (KANs) and LSTM, demonstrate promising outcomes.
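The listing carries no reference code; as a rough illustration of the core idea (an RNN parameterizing the latent transition of a state space model, with a linear emission map producing the forecast), a minimal PyTorch sketch might look like the following. Class names, dimensions, and the toy data are illustrative assumptions, not the authors' implementation; nn.LSTM could be swapped for nn.GRU or a TKAN-style cell.

```python
import torch
import torch.nn as nn

class StateSpaceRNN(nn.Module):
    """Illustrative state space forecaster: an LSTM drives the latent
    state transition, a linear head plays the role of the emission map."""

    def __init__(self, n_features: int, latent_dim: int = 32):
        super().__init__()
        # Transition model: latent state evolves under an LSTM
        self.transition = nn.LSTM(n_features, latent_dim, batch_first=True)
        # Emission model: linear map from latent state to observation
        self.emission = nn.Linear(latent_dim, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features); predict the next observation
        latent, _ = self.transition(x)        # (batch, time, latent_dim)
        return self.emission(latent[:, -1])   # one-step-ahead forecast

# One-step-ahead training on a toy series (stand-in for market data)
model = StateSpaceRNN(n_features=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
series, target = torch.randn(64, 30, 1), torch.randn(64, 1)
opt.zero_grad()
loss = nn.functional.mse_loss(model(series), target)
loss.backward()
opt.step()
```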
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN) we reduce the computational time and space complexities from cubic and quadratic with respect to the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
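The headline here is the complexity reduction. As a toy analogy only (not the S-MNN reformulation itself), the gap between propagating linear dynamics as an O(T) scan and solving the equivalent stacked linear system, which is cubic in the total problem size, can be sketched as follows; the dynamics are made up.

```python
import numpy as np

T, d = 200, 4                      # sequence length, state dimension
A = 0.9 * np.eye(d)                # toy linear transition
b = np.random.randn(T, d)          # per-step inputs

# Linear-time scan: one recurrence step per time index
x = np.zeros(d)
states = []
for t in range(T):
    x = A @ x + b[t]
    states.append(x)
states = np.stack(states)          # (T, d)

# Equivalent dense formulation: assemble the (T*d, T*d) block system
# (identity blocks on the diagonal, -A on the subdiagonal) and solve it.
# np.linalg.solve here costs O((T*d)^3) -- the cubic cost a scan avoids.
M = np.eye(T * d)
for t in range(1, T):
    M[t*d:(t+1)*d, (t-1)*d:t*d] = -A
dense = np.linalg.solve(M, b.reshape(-1)).reshape(T, d)
assert np.allclose(states, dense)  # both routes give the same states
```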
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Hybridization of Persistent Homology with Neural Networks for Time-Series Prediction: A Case Study in Wave Height [0.0]
We introduce a feature engineering method that enhances the predictive performance of neural network models.
Specifically, we leverage computational topology techniques to derive valuable topological features from input data.
For time-ahead predictions, the enhancements in $R^2$ score were significant for FNN, RNN, LSTM, and GRU models.
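The feature-engineering step can be approximated with off-the-shelf tools. A minimal sketch, assuming the ripser package for persistence diagrams and using a Takens delay embedding plus simple lifetime statistics (the paper's exact feature set may differ):

```python
import numpy as np
from ripser import ripser   # assumed dependency: pip install ripser

def topological_features(window: np.ndarray, dim: int = 3, tau: int = 2):
    """Derive simple persistence features from one time-series window.
    Illustrative only; not necessarily the paper's feature pipeline."""
    # Takens delay embedding: turn the 1-D window into a point cloud
    n = len(window) - (dim - 1) * tau
    cloud = np.stack([window[i:i + n] for i in range(0, dim * tau, tau)],
                     axis=1)                       # (n, dim)
    # H1 persistence diagram of the embedded point cloud
    dgm = ripser(cloud, maxdim=1)['dgms'][1]
    lifetimes = dgm[:, 1] - dgm[:, 0] if len(dgm) else np.zeros(1)
    # Summary statistics appended to the network's usual inputs
    return np.array([lifetimes.max(), lifetimes.sum(), len(lifetimes)])
```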
arXiv Detail & Related papers (2024-09-03T01:26:21Z)
- CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
We develop the Liquid Quantum Neural Network (LQNet) and the Continuous Time Recurrent Quantum Neural Network (CTRQNet).
LQNet and CTRQNet achieve accuracy increases as high as 40% on CIFAR-10 binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z)
- SpikingSSMs: Learning Long Sequences with Sparse and Parallel Spiking State Space Models [19.04709216497077]
We develop spiking state space models (SpikingSSMs) for long sequence learning.
Inspired by dendritic neuron structure, we hierarchically integrate neuronal dynamics with the original SSM block.
We propose a lightweight surrogate dynamic network that accurately predicts the after-reset membrane potential and is compatible with learnable thresholds.
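The paper's surrogate dynamic network is not reproduced here, but the recurrence it approximates, leaky integrate-and-fire dynamics with a reset and a surrogate gradient through the spike, can be sketched in PyTorch as follows; the constants are illustrative.

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a sigmoid surrogate gradient."""
    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        sg = torch.sigmoid(4.0 * x)
        return grad_out * 4.0 * sg * (1 - sg)   # smooth stand-in derivative

def lif_step(v, inp, beta=0.9, thresh=1.0):
    """One leaky integrate-and-fire step with soft reset. A sequential
    reference; SpikingSSMs' point is to parallelize this recurrence."""
    v = beta * v + inp                 # leaky integration
    s = SpikeFn.apply(v - thresh)      # spike if membrane crosses threshold
    v = v - s * thresh                 # after-reset membrane potential
    return v, s
```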
arXiv Detail & Related papers (2024-08-27T09:35:49Z)
- TKAN: Temporal Kolmogorov-Arnold Networks [0.0]
Long Short-Term Memory (LSTM) has demonstrated its ability to capture long-term dependencies in sequential data.
Inspired by Kolmogorov-Arnold Networks (KANs), a promising alternative to Multi-Layer Perceptrons (MLPs), we propose a new neural network architecture combining KAN and LSTM: the Temporal Kolmogorov-Arnold Networks (TKANs).
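A heavily simplified sketch of the idea, using a radial-basis expansion as a crude stand-in for KAN's learnable B-spline edges inside an LSTM-style cell; this is an assumption-laden illustration, not the authors' TKAN:

```python
import torch
import torch.nn as nn

class KANishLayer(nn.Module):
    """Per-input learnable activations over a fixed RBF basis -- a rough
    proxy for KAN's spline edges, for illustration only."""
    def __init__(self, n_in, n_out, n_basis=8):
        super().__init__()
        self.register_buffer("centers", torch.linspace(-2.0, 2.0, n_basis))
        self.coef = nn.Parameter(torch.randn(n_in * n_basis, n_out) * 0.1)

    def forward(self, x):                        # x: (batch, n_in)
        phi = torch.exp(-(x.unsqueeze(-1) - self.centers) ** 2)
        return phi.flatten(1) @ self.coef        # (batch, n_out)

class TKANishCell(nn.Module):
    """Recurrent cell with LSTM-style gating whose candidate update
    comes from the KAN-ish layer."""
    def __init__(self, n_in, n_hidden):
        super().__init__()
        self.gates = nn.Linear(n_in + n_hidden, 3 * n_hidden)
        self.cand = KANishLayer(n_in + n_hidden, n_hidden)

    def forward(self, x, h, c):
        z = torch.cat([x, h], dim=-1)
        f, i, o = torch.sigmoid(self.gates(z)).chunk(3, dim=-1)
        c = f * c + i * torch.tanh(self.cand(z))  # KAN-driven candidate
        h = o * torch.tanh(c)
        return h, c
```

Unrolling this cell over a sequence and training with one-step-ahead targets mirrors a standard LSTM training loop.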
arXiv Detail & Related papers (2024-05-12T17:40:48Z)
- VMRNN: Integrating Vision Mamba and LSTM for Efficient and Accurate Spatiotemporal Forecasting [11.058879849373572]
ViTs or CNNs combined with RNNs for spatiotemporal forecasting have achieved unparalleled results in predicting temporal and spatial dynamics.
Recent Mamba-based architectures have been met with enthusiasm for their exceptional long-sequence modeling capabilities.
We propose the VMRNN cell, a recurrent unit that integrates the strengths of Vision Mamba blocks with LSTM.
arXiv Detail & Related papers (2024-03-25T08:26:42Z)
- Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices with super-high speeds.
We propose a novel approach, the Fully Spiking Denoising Diffusion Implicit Model (FSDDIM), to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
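LKCNN specifics aside, the underlying task is easy to reproduce: label short series as regular or chaotic and fit a small 1-D CNN. A toy dataset using the logistic map (regime boundaries are approximate; narrow periodic windows exist inside the chaotic range):

```python
import numpy as np

def logistic_series(r, n=128, burn=100):
    """Logistic map trajectory; r near 3.5 is periodic, near 3.9 chaotic."""
    x, out = 0.5, []
    for i in range(burn + n):
        x = r * x * (1 - x)
        if i >= burn:
            out.append(x)
    return np.array(out)

# Binary dataset: class 0 = periodic regime, class 1 = chaotic regime
rs = np.concatenate([np.random.uniform(3.40, 3.55, 200),
                     np.random.uniform(3.90, 4.00, 200)])
X = np.stack([logistic_series(r) for r in rs])   # (400, 128)
y = np.array([0] * 200 + [1] * 200)
# X and y can now be fed to any small 1-D CNN classifier.
```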
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates and noisy, irregular observations.
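The generic HNN building block the paper extends is compact enough to sketch: learn a scalar H(q, p) and read the vector field off its symplectic gradient via autograd. The continuous-time trajectory estimation that is the paper's actual contribution is omitted here.

```python
import torch
import torch.nn as nn

class HNN(nn.Module):
    """Learn a scalar Hamiltonian H(q, p); dynamics follow from its
    symplectic gradient: dq/dt = dH/dp, dp/dt = -dH/dq."""
    def __init__(self, dim=1, hidden=64):
        super().__init__()
        self.H = nn.Sequential(nn.Linear(2 * dim, hidden), nn.Tanh(),
                               nn.Linear(hidden, 1))
        self.dim = dim

    def time_derivative(self, x):               # x: (batch, 2*dim) = (q, p)
        x = x.requires_grad_(True)
        grad = torch.autograd.grad(self.H(x).sum(), x, create_graph=True)[0]
        dq = grad[:, self.dim:]                 #  dH/dp
        dp = -grad[:, :self.dim]                # -dH/dq
        return torch.cat([dq, dp], dim=-1)

# Training regresses time_derivative(x) onto observed (dq/dt, dp/dt);
# the paper fits continuous trajectories instead of finite-difference
# derivatives, which is what helps with sparse or noisy sampling.
```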
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNNs) have recently emerged as a new direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- Spatio-Temporal Neural Network for Fitting and Forecasting COVID-19 [1.1129587851149594]
We established a Spatio-Temporal Neural Network, namely STNN, to forecast the spread of the coronavirus COVID-19 outbreak worldwide in 2020.
Two improved STNN architectures, namely the STNN with Augmented Spatial States (STNN-A) and the STNN with Input Gate (STNN-I), are proposed.
Numerical simulations demonstrate that STNN models outperform many others by providing more accurate fitting and prediction, and by handling both spatial and temporal data.
arXiv Detail & Related papers (2021-03-22T13:59:14Z)