Hybridizing Traditional and Next-Generation Reservoir Computing to Accurately and Efficiently Forecast Dynamical Systems
- URL: http://arxiv.org/abs/2403.18953v2
- Date: Wed, 5 Jun 2024 21:49:03 GMT
- Title: Hybridizing Traditional and Next-Generation Reservoir Computing to Accurately and Efficiently Forecast Dynamical Systems
- Authors: Ravi Chepuri, Dael Amzalag, Thomas Antonsen Jr., Michelle Girvan
- Abstract summary: Reservoir computers (RCs) are powerful machine learning architectures for time series prediction.
Next generation reservoir computers (NGRCs) have been introduced, offering distinct advantages over RCs.
Here, we introduce a hybrid RC-NGRC approach for time series forecasting of dynamical systems.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reservoir computers (RCs) are powerful machine learning architectures for time series prediction. Recently, next generation reservoir computers (NGRCs) have been introduced, offering distinct advantages over RCs, such as reduced computational expense and lower training data requirements. However, NGRCs have their own practical difficulties, including sensitivity to sampling time and to the type of nonlinearities in the data. Here, we introduce a hybrid RC-NGRC approach for time series forecasting of dynamical systems. We show that our hybrid approach can produce accurate short-term predictions and capture the long-term statistics of chaotic dynamical systems in situations where the RC and NGRC components alone are insufficient, e.g., due to constraints from limited computational resources, sub-optimal hyperparameters, or sparsely-sampled training data. Under these conditions, we show for multiple model chaotic systems that the hybrid RC-NGRC method with a small reservoir can achieve prediction performance approaching that of a traditional RC with a much larger reservoir, illustrating that the hybrid approach can offer significant gains in computational efficiency over traditional RCs while simultaneously addressing some of the limitations of NGRCs. Our results suggest that the hybrid RC-NGRC approach may be particularly beneficial in cases where computational efficiency is a high priority and an NGRC alone is not adequate.
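The hybrid scheme described in the abstract can be sketched in code. The sketch below is a minimal illustration, not the authors' implementation: it assumes the hybrid feature vector is formed by concatenating a small echo-state reservoir's state with NGRC-style polynomial features of delayed inputs, feeding a single linear readout; all function and variable names are hypothetical.

```python
import numpy as np

def hybrid_rc_ngrc_features(u_history, r_state, W_in, A, k=2, leak=1.0):
    """One step of a hypothetical hybrid feature map: update a small
    echo-state reservoir, build NGRC polynomial features from the last
    k delayed inputs, and concatenate both feature sets for a shared
    linear readout. (Illustrative sketch; parameters are assumptions.)"""
    u_t = u_history[-1]
    # Standard leaky echo-state reservoir update with tanh nonlinearity.
    r_next = (1 - leak) * r_state + leak * np.tanh(A @ r_state + W_in @ u_t)
    # NGRC features: constant, linear delay-embedded inputs, and all
    # unique quadratic monomials of those inputs.
    lin = np.concatenate(u_history[-k:])
    quad = np.array([lin[i] * lin[j]
                     for i in range(lin.size) for j in range(i, lin.size)])
    features = np.concatenate(([1.0], lin, quad, r_next))
    return features, r_next
```

In this sketch, forecasting would proceed as usual for reservoir computing: stack the hybrid feature vectors over the training trajectory, fit a ridge-regression readout mapping features at time t to the state at t+1, then iterate the closed loop.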
Related papers
- Oscillations enhance time-series prediction in reservoir computing with feedback [3.3686252536891454]
Reservoir computing is a machine learning framework used for modeling the brain.
It is difficult to accurately reproduce the long-term target time series because the reservoir system becomes unstable.
This study proposes oscillation-driven reservoir computing (ODRC) with feedback.
arXiv Detail & Related papers (2024-06-05T02:30:29Z)
- Dynamic Scheduling for Federated Edge Learning with Streaming Data [56.91063444859008]
We consider a Federated Edge Learning (FEEL) system where training data are randomly generated over time at a set of distributed edge devices with long-term energy constraints.
Due to limited communication resources and latency requirements, only a subset of devices is scheduled for participating in the local training process in every iteration.
arXiv Detail & Related papers (2023-05-02T07:41:16Z)
- Embedding Theory of Reservoir Computing and Reducing Reservoir Network Using Time Delays [6.543793376734818]
Reservoir computing (RC) is under explosive development due to its exceptional efficacy and high performance in the reconstruction and/or prediction of complex physical systems.
Here, we rigorously prove that RC is essentially a high dimensional embedding of the original input nonlinear dynamical system.
We significantly reduce the network size of RC for reconstructing and predicting some representative physical systems, and, more surprisingly, only using a single neuron reservoir with time delays is sometimes sufficient for achieving those tasks.
arXiv Detail & Related papers (2023-03-16T02:25:51Z)
- Catch-22s of reservoir computing [0.0]
Reservoir Computing is a simple and efficient framework for forecasting the behavior of nonlinear dynamical systems from data.
We focus on the important problem of basin prediction -- determining which attractor a system will converge to from its initial conditions.
By incorporating the exact nonlinearities in the original equations, we show that NGRC can accurately reconstruct intricate and high-dimensional basins of attraction.
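The NGRC approach referenced above can be summarized as: build features from a short delay embedding of the input plus low-order polynomial products, then fit a linear readout by ridge regression. A minimal sketch under those assumptions (function names and the quadratic feature library are illustrative, not taken from the cited paper):

```python
import numpy as np

def ngrc_fit(U, k=2, ridge=1e-6):
    """Fit a minimal next-generation RC: features are a constant, k
    time-delayed copies of the input, and all unique quadratic products;
    the readout maps features at time t to the state at t+1.
    U: training trajectory of shape (T, d). (Illustrative sketch.)"""
    T, d = U.shape
    feats, targets = [], []
    for t in range(k - 1, T - 1):
        lin = U[t - k + 1 : t + 1].ravel()
        quad = np.array([lin[i] * lin[j]
                         for i in range(lin.size) for j in range(i, lin.size)])
        feats.append(np.concatenate(([1.0], lin, quad)))
        targets.append(U[t + 1])
    Phi = np.array(feats)    # (samples, n_features)
    Y = np.array(targets)    # (samples, d)
    # Ridge-regression readout: solve (Phi^T Phi + ridge*I) W^T = Phi^T Y.
    G = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    W_out = np.linalg.solve(G, Phi.T @ Y).T
    return W_out
```

Because the feature library here contains the exact quadratic nonlinearity, a map like the logistic map is representable exactly, which is the flavor of the "incorporating the exact nonlinearities" observation above.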
arXiv Detail & Related papers (2022-10-18T23:31:15Z)
- When does return-conditioned supervised learning work for offline reinforcement learning? [51.899892382786526]
We study the capabilities and limitations of return-conditioned supervised learning.
We find that RCSL returns the optimal policy under a set of assumptions stronger than those needed for the more traditional dynamic programming-based algorithms.
arXiv Detail & Related papers (2022-06-02T15:05:42Z)
- Distributionally Robust Models with Parametric Likelihood Ratios [123.05074253513935]
Three simple ideas allow us to train models with DRO using a broader class of parametric likelihood ratios.
We find that models trained with the resulting parametric adversaries are consistently more robust to subpopulation shifts when compared to other DRO approaches.
arXiv Detail & Related papers (2022-04-13T12:43:12Z)
- A Systematic Exploration of Reservoir Computing for Forecasting Complex Spatiotemporal Dynamics [0.0]
A reservoir computer (RC) is a type of recurrent neural network that has demonstrated success in predicting intrinsically chaotic dynamical systems.
We explore the architecture and design choices for a "best in class" RC for a number of characteristic dynamical systems.
We show the application of these choices in scaling up to larger models using localization.
arXiv Detail & Related papers (2022-01-21T22:31:12Z)
- Efficient Micro-Structured Weight Unification and Pruning for Neural Network Compression [56.83861738731913]
Deep Neural Network (DNN) models are essential for practical applications, especially for resource limited devices.
Previous unstructured or structured weight pruning methods can hardly truly accelerate inference.
We propose a generalized weight unification framework at a hardware compatible micro-structured level to achieve high amount of compression and acceleration.
arXiv Detail & Related papers (2021-06-15T17:22:59Z)
- Combining Deep Learning and Optimization for Security-Constrained Optimal Power Flow [94.24763814458686]
Security-constrained optimal power flow (SCOPF) is fundamental in power systems.
Modeling of APR within the SCOPF problem results in complex large-scale mixed-integer programs.
This paper proposes a novel approach that combines deep learning and robust optimization techniques.
arXiv Detail & Related papers (2020-07-14T12:38:21Z)
- Model-Size Reduction for Reservoir Computing by Concatenating Internal States Through Time [2.6872737601772956]
Reservoir computing (RC) is a machine learning algorithm that can learn complex time series from data very rapidly.
To implement RC in edge computing, it is highly important to reduce the amount of computational resources that RC requires.
We propose methods that reduce the size of the reservoir by inputting the past or drifting states of the reservoir to the output layer at the current time step.
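The size-reduction idea described above, feeding past reservoir states to the output layer, can be illustrated with a small helper. This is a hedged sketch of the general concatenation trick, not the cited paper's method: it assumes the readout at time t sees the current state together with the s-1 previous states, so an N-node reservoir yields s*N readout features.

```python
import numpy as np

def concat_states_readout_input(state_history, s=3):
    """Concatenate the current reservoir state with the s-1 previous
    states to form the readout input at the current time step.
    `state_history`: array of shape (t, N), oldest row first.
    (Hypothetical helper; names are illustrative.)"""
    R = np.asarray(state_history)
    if R.shape[0] < s:
        raise ValueError("need at least s stored states")
    return R[-s:].ravel()  # shape (s * N,)
```

A linear readout trained on these concatenated vectors sees s times as many features per step without enlarging the reservoir itself, which is the computational-resource trade-off the summary describes.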
arXiv Detail & Related papers (2020-06-11T06:11:03Z)
- Combining Machine Learning with Knowledge-Based Modeling for Scalable Forecasting and Subgrid-Scale Closure of Large, Complex, Spatiotemporal Systems [48.7576911714538]
We attempt to utilize machine learning as the essential tool for integrating past temporal data into predictions.
We propose combining two approaches: (i) a parallel machine learning prediction scheme; and (ii) a hybrid technique, for a composite prediction system composed of a knowledge-based component and a machine-learning-based component.
We demonstrate that not only can this method combining (i) and (ii) be scaled to give excellent performance for very large systems, but also that the length of time series data needed to train our multiple, parallel machine learning components is dramatically less than that necessary without parallelization.
arXiv Detail & Related papers (2020-02-10T23:21:50Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.