Oscillations enhance time-series prediction in reservoir computing with feedback
- URL: http://arxiv.org/abs/2406.02867v1
- Date: Wed, 5 Jun 2024 02:30:29 GMT
- Title: Oscillations enhance time-series prediction in reservoir computing with feedback
- Authors: Yuji Kawai, Takashi Morita, Jihoon Park, Minoru Asada
- Abstract summary: Reservoir computing is a machine learning framework used for modeling the brain.
It is difficult to accurately reproduce the long-term target time series because the reservoir system becomes unstable.
This study proposes oscillation-driven reservoir computing (ODRC) with feedback.
- Score: 3.3686252536891454
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reservoir computing, a machine learning framework used for modeling the brain, can predict temporal data from few observations with minimal computational resources. However, it is difficult to accurately reproduce a long-term target time series because the reservoir system becomes unstable. This predictive capability is required for a wide variety of time-series processing tasks, including predictions of motor timing and chaotic dynamical systems. This study proposes oscillation-driven reservoir computing (ODRC) with feedback, in which oscillatory signals are fed into a reservoir network to stabilize the network activity and induce complex reservoir dynamics. The ODRC reproduces long-term target time series more accurately than conventional reservoir computing methods in motor timing and chaotic time-series prediction tasks. Furthermore, it generates a time series similar to the target during unobserved periods; that is, it can learn the abstract generative rules from limited observations. Given these significant improvements achieved by a simple and computationally inexpensive implementation, the ODRC could serve as a practical model of various time-series data. Moreover, we discuss the biological implications of the ODRC, considering it as a model of neural oscillations and their cerebellar processors.
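The mechanism the abstract describes, a fixed recurrent reservoir driven by sinusoidal oscillator inputs and output feedback, with only a linear readout trained, can be sketched as a minimal echo state network. All sizes, frequencies, and the toy target below are illustrative assumptions, not the paper's configuration; during training the feedback is teacher-forced with the target, and the readout is fit by ridge regression.

```python
import numpy as np

# Minimal, assumption-laden sketch of oscillation-driven reservoir
# computing (ODRC): fixed sinusoidal "oscillator" signals drive a random
# recurrent reservoir together with output feedback, and only the linear
# readout W_out is trained. Sizes, frequencies, and the target are toys.

rng = np.random.default_rng(0)
N = 200                                  # reservoir size (illustrative)
T = 1000                                 # training steps
dt = 0.01
freqs = np.array([1.0, 2.0, 3.0, 5.0])   # oscillator frequencies in Hz (assumed)
t = np.arange(T) * dt
osc = np.sin(2 * np.pi * freqs[None, :] * t[:, None])   # (T, 4) oscillatory drive

W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))   # fixed recurrent weights
W_in = rng.uniform(-1, 1, (N, osc.shape[1]))  # oscillator input weights
W_fb = rng.uniform(-1, 1, (N, 1))             # output-feedback weights

target = np.sin(2 * np.pi * 0.5 * t)[:, None]  # toy target time series

# Teacher-forced state collection: the feedback channel carries the
# target itself during training.
x = np.zeros(N)
states = np.empty((T, N))
for k in range(T):
    fb = target[k - 1] if k > 0 else np.zeros(1)
    x = np.tanh(W @ x + W_in @ osc[k] + W_fb @ fb)
    states[k] = x

# Ridge-regression readout (normal equations).
lam = 1e-3
W_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ target)

pred = states @ W_out
mse = np.mean((pred[100:] - target[100:]) ** 2)  # skip an initial washout
```

After training, closed-loop generation would replace the teacher-forced `fb` with the network's own previous output `pred[k - 1]`; the paper's claim is that the oscillatory drive keeps that closed loop stable.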
Related papers
- A novel Reservoir Architecture for Periodic Time Series Prediction [4.7368661961661775]
This paper introduces a novel approach to predicting periodic time series using reservoir computing.
The model is tailored to deliver precise forecasts of rhythms, a crucial aspect for tasks such as generating musical rhythm.
Our network accurately predicts rhythmic signals within the human frequency perception range.
arXiv Detail & Related papers (2024-05-16T13:55:53Z)
- Chaotic attractor reconstruction using small reservoirs - the influence of topology [0.0]
Reservoir computing has been shown to be an effective method of forecasting chaotic dynamics.
We show that a reservoir of uncoupled nodes more reliably produces long-term time-series predictions.
arXiv Detail & Related papers (2024-02-23T09:43:52Z)
- Reduced-order modeling of unsteady fluid flow using neural network ensembles [0.0]
We propose using bagging, a commonly used ensemble learning technique, to develop a fully data-driven reduced-order model framework.
The framework uses CAEs for spatial reconstruction of the full-order model and LSTM ensembles for time-series prediction.
Results show that the presented framework effectively reduces error propagation and leads to more accurate time-series prediction of latent variables at unseen points.
arXiv Detail & Related papers (2024-02-08T03:02:59Z)
- Gated Recurrent Neural Networks with Weighted Time-Delay Feedback [59.125047512495456]
We introduce a novel gated recurrent unit (GRU) with a weighted time-delay feedback mechanism.
We show that $\tau$-GRU can converge faster and generalize better than state-of-the-art recurrent units and gated recurrent architectures.
arXiv Detail & Related papers (2022-12-01T02:26:34Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, such models can exhibit artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
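The plain noise regularization that LMNT is said to approximate deterministically can be sketched as follows: small i.i.d. noise is added to the model inputs during training so that the learned one-step map stays accurate when its own imperfect predictions are fed back. The dynamical system, feature map, and noise scale below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Sketch of input-noise regularization (the baseline LMNT approximates):
# fit a one-step predictor on noisy inputs. System, features, and noise
# amplitude are toy assumptions.

rng = np.random.default_rng(2)
T = 2000
x = np.empty(T)
x[0] = 0.3
for k in range(T - 1):                 # logistic map as a toy chaotic system
    x[k + 1] = 3.9 * x[k] * (1 - x[k])

sigma = 1e-3                           # training noise amplitude (assumed)
u = x[:-1] + sigma * rng.normal(size=T - 1)            # noisy training inputs
feats = np.stack([u, u**2, np.ones_like(u)], axis=1)   # quadratic features
w = np.linalg.lstsq(feats, x[1:], rcond=None)[0]       # fit the one-step map

# One-step error of the fitted map evaluated on clean inputs.
pred = w[0] * x[:-1] + w[1] * x[:-1] ** 2 + w[2]
mse = np.mean((pred - x[1:]) ** 2)
```

LMNT's contribution, per the summary above, is to replace the many random noise realizations this scheme needs with a deterministic linearized approximation of their average effect.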
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- A Systematic Exploration of Reservoir Computing for Forecasting Complex Spatiotemporal Dynamics [0.0]
A reservoir computer (RC) is a type of recurrent neural network that has demonstrated success in predicting intrinsically chaotic dynamical systems.
We explore the architecture and design choices for a "best in class" RC for a number of characteristic dynamical systems.
We show the application of these choices in scaling up to larger models using localization.
arXiv Detail & Related papers (2022-01-21T22:31:12Z)
- Online learning of windmill time series using Long Short-term Cognitive Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach reported the lowest forecasting errors with respect to a simple RNN, a Long Short-term Memory, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Model-Size Reduction for Reservoir Computing by Concatenating Internal States Through Time [2.6872737601772956]
Reservoir computing (RC) is a machine learning algorithm that can learn complex time series from data very rapidly.
To implement RC in edge computing, it is highly important to reduce the amount of computational resources that RC requires.
We propose methods that reduce the size of the reservoir by inputting the past or drifting states of the reservoir to the output layer at the current time step.
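The size-reduction idea summarized above, feeding past reservoir states into the output layer so a small reservoir supplies a large feature set, can be sketched minimally. The reservoir size, history depth, and toy prediction task below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

# Hedged sketch of model-size reduction by concatenating internal states
# through time: the readout sees the K most recent states of a small
# N-node reservoir, i.e. N*K features instead of one large reservoir.

rng = np.random.default_rng(1)
N, K, T = 20, 5, 500                      # small reservoir, 5-step history (assumed)
W = rng.normal(0, 0.9 / np.sqrt(N), (N, N))
W_in = rng.uniform(-0.5, 0.5, (N, 1))

u = np.sin(0.1 * np.arange(T))[:, None]   # toy input signal
target = np.roll(u, -1, axis=0)           # one-step-ahead prediction target

x = np.zeros(N)
hist = np.zeros((K, N))
feats = np.empty((T, N * K))
for k in range(T):
    x = np.tanh(W @ x + (W_in @ u[k]))
    hist = np.vstack([x[None, :], hist[:-1]])   # shift the newest state in
    feats[k] = hist.ravel()                     # concatenated N*K feature vector

# Ridge-regression readout over the concatenated states.
lam = 1e-6
W_out = np.linalg.solve(feats.T @ feats + lam * np.eye(N * K), feats.T @ target)
pred = feats @ W_out
mse = np.mean((pred[50:-1] - target[50:-1]) ** 2)  # skip washout and wrap-around
```

The readout dimension grows to N*K, but the recurrent update still costs only O(N^2) per step, which is the point of the approach for edge deployment.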
arXiv Detail & Related papers (2020-06-11T06:11:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.