Effect of temporal resolution on the reproduction of chaotic dynamics
via reservoir computing
- URL: http://arxiv.org/abs/2302.10761v2
- Date: Wed, 22 Feb 2023 06:08:10 GMT
- Title: Effect of temporal resolution on the reproduction of chaotic dynamics
via reservoir computing
- Authors: Kohei Tsuchiyama, André Röhm, Takatomo Mihana, Ryoichi Horisaki, Makoto Naruse
- Abstract summary: Reservoir computing is a machine learning paradigm that uses a structure called a reservoir, which has nonlinearities and short-term memory.
This study analyzes the effect of sampling on the ability of reservoir computing to autonomously regenerate chaotic time series.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reservoir computing is a machine learning paradigm that uses a structure
called a reservoir, which has nonlinearities and short-term memory. In recent
years, reservoir computing has expanded to new functions such as the autonomous
generation of chaotic time series, as well as time series prediction and
classification. Furthermore, novel possibilities have been demonstrated, such
as inferring the existence of previously unseen attractors. Sampling, in
particular, has a strong influence on such functions. Sampling is indispensable
in a physical reservoir computer that uses an existing physical system as a
reservoir because the use of an external digital system for the data input is
usually inevitable. This study analyzes the effect of sampling on the ability
of reservoir computing to autonomously regenerate chaotic time series. We
found, as expected, that excessively coarse sampling degrades the system
performance, but also that excessively dense sampling is unsuitable. Based on
quantitative indicators that capture the local and global characteristics of
attractors, we identify a suitable window of the sampling frequency and discuss
its underlying mechanisms.
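The setup the abstract describes can be illustrated with a minimal echo-state-network sketch in NumPy. This is not the authors' code: the reservoir size, spectral radius, ridge parameter, and the `sample_every` subsampling factor (the temporal-resolution knob the study varies) are illustrative assumptions. A chaotic series is integrated at fine resolution, subsampled, used to train the reservoir by teacher forcing, and then the trained system is run in closed loop, feeding its own output back as input, to regenerate the series autonomously.

```python
import numpy as np

def lorenz_series(n_samples, dt=0.01, sample_every=5):
    """Euler-integrate the Lorenz system, keeping every sample_every-th step.

    sample_every sets the temporal resolution of the recorded series;
    this is the quantity whose effect the paper analyzes.
    """
    s, r, b = 10.0, 28.0, 8.0 / 3.0
    x = np.array([1.0, 1.0, 1.0])
    out = []
    for i in range(n_samples * sample_every):
        dx = np.array([s * (x[1] - x[0]),
                       x[0] * (r - x[2]) - x[1],
                       x[0] * x[1] - b * x[2]])
        x = x + dt * dx
        if (i + 1) % sample_every == 0:
            out.append(x.copy())
    return np.array(out)

rng = np.random.default_rng(0)
N = 300  # reservoir size (illustrative choice)
data = lorenz_series(3000)
data = (data - data.mean(0)) / data.std(0)  # normalize each coordinate

# Random input and recurrent weights; rescale W to spectral radius 0.9.
W_in = rng.uniform(-0.5, 0.5, (N, 3))
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Teacher forcing: drive the reservoir with the true series, collect states.
r = np.zeros(N)
states = []
for u in data[:-1]:
    r = np.tanh(W @ r + W_in @ u)
    states.append(r.copy())
states = np.array(states)

# Ridge-regression readout: state after seeing data[k] predicts data[k+1].
washout = 100
X, Y = states[washout:], data[washout + 1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y).T

# Closed loop: feed the reservoir's own prediction back as the next input.
r, y = states[-1], data[-1]
free_run = []
for _ in range(200):
    r = np.tanh(W @ r + W_in @ y)
    y = W_out @ r
    free_run.append(y)
free_run = np.array(free_run)
```

Changing `sample_every` changes the sampling interval of the training data; the paper's finding is that both very coarse and very fine sampling degrade how well the closed-loop run reproduces the attractor.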
Related papers
- Oscillations enhance time-series prediction in reservoir computing with feedback [3.3686252536891454]
Reservoir computing is a machine learning framework used for modeling the brain.
It is difficult to accurately reproduce the long-term target time series because the reservoir system becomes unstable.
This study proposes oscillation-driven reservoir computing (ODRC) with feedback.
arXiv Detail & Related papers (2024-06-05T02:30:29Z)
- Stochastic Reservoir Computers [0.0]
In reservoir computing, the number of distinct states of the entire reservoir computer can potentially scale exponentially with the size of the reservoir hardware.
While shot noise is a limiting factor in the performance of reservoir computing, we show significantly improved performance compared to a reservoir computer with similar hardware in cases where the effects of noise are small.
arXiv Detail & Related papers (2024-05-20T21:26:00Z)
- Chaotic attractor reconstruction using small reservoirs - the influence of topology [0.0]
Reservoir computing has been shown to be an effective method of forecasting chaotic dynamics.
We show that a reservoir of uncoupled nodes more reliably produces long-term time-series predictions.
arXiv Detail & Related papers (2024-02-23T09:43:52Z)
- Controlling dynamical systems to complex target states using machine learning: next-generation vs. classical reservoir computing [68.8204255655161]
Controlling nonlinear dynamical systems with machine learning makes it possible to drive them into simple behavior, such as periodicity, as well as into more complex, arbitrary dynamics.
We show first that classical reservoir computing excels at this task.
In a next step, we compare those results based on different amounts of training data to an alternative setup, where next-generation reservoir computing is used instead.
It turns out that while delivering comparable performance for usual amounts of training data, next-generation RC significantly outperforms in situations where only very limited data is available.
arXiv Detail & Related papers (2023-07-14T07:05:17Z)
- Inferring Attracting Basins of Power System with Machine Learning [5.83843172320071]
We propose a new machine learning technique, namely balanced reservoir computing, to infer the attracting basins of a typical power system.
We demonstrate that the trained machine can predict accurately whether the system will return to the functional state in response to a large, random perturbation.
arXiv Detail & Related papers (2023-05-20T08:42:29Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
- Natural quantum reservoir computing for temporal information processing [4.785845498722406]
Reservoir computing is a temporal information processing system that exploits artificial or physical dissipative dynamics.
This paper proposes the use of real superconducting quantum computing devices as the reservoir, where the dissipative property is served by the natural noise added to the quantum bits.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-06-22T15:07:06Z)
- Automatic Recall Machines: Internal Replay, Continual Learning and the Brain [104.38824285741248]
Replay in neural networks involves training on sequential data with memorized samples, which counteracts forgetting of previous behavior caused by non-stationarity.
We present a method where these auxiliary samples are generated on the fly, given only the model that is being trained for the assessed objective.
Instead, the implicit memory of learned samples within the assessed model itself is exploited.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
- Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.