Chaotic attractor reconstruction using small reservoirs - the influence
of topology
- URL: http://arxiv.org/abs/2402.16888v1
- Date: Fri, 23 Feb 2024 09:43:52 GMT
- Title: Chaotic attractor reconstruction using small reservoirs - the influence
of topology
- Authors: Lina Jaurigue
- Abstract summary: Reservoir computing has been shown to be an effective method of forecasting chaotic dynamics.
We show that a reservoir of uncoupled nodes more reliably produces long term timeseries predictions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Forecasting timeseries based upon measured data is needed in a wide range of
applications and has been the subject of extensive research. A particularly
challenging task is the forecasting of timeseries generated by chaotic
dynamics. In recent years reservoir computing has been shown to be an effective
method of forecasting chaotic dynamics and reconstructing chaotic attractors
from data. In this work strides are made toward smaller and lower complexity
reservoirs with the goal of improved hardware implementability and more
reliable production of adequate surrogate models. We show that a reservoir of
uncoupled nodes more reliably produces long term timeseries predictions than
complex reservoir topologies. We then link the improved attractor
reconstruction of the uncoupled reservoir with smaller spectral radii of the
resulting surrogate systems. These results indicate that the node degree plays
an important role in determining whether the desired dynamics will be stable in
the autonomous surrogate system which is attained via closed-loop operation of
the trained reservoir. In terms of hardware implementability, uncoupled nodes
would allow for greater freedom in the hardware architecture because no complex
coupling setups are needed and because, for uncoupled nodes, the system
response is equivalent for space and time multiplexing.
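The approach described in the abstract can be sketched as follows. This is a minimal illustration (not the authors' code): an echo-state-style reservoir whose nodes are uncoupled (no internal connections, so each node responds only to the input), trained by ridge regression and then run in closed loop as an autonomous surrogate. The node count, leak rates, regularization strength, and the Lorenz driver are all illustrative assumptions; the spectral radius computed at the end refers to the Jacobian of the closed-loop surrogate map, which the paper links to reliable attractor reconstruction.

```python
# Sketch of an uncoupled-node reservoir forecaster (assumptions: Lorenz x-component
# driver, N=50 nodes, per-node leak rates, ridge regression readout).
import numpy as np

rng = np.random.default_rng(0)

def lorenz_series(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """x-component of the Lorenz system, simple Euler integration."""
    x, y, z = 1.0, 1.0, 1.0
    out = np.empty(n)
    for i in range(n):
        x += dt * sigma * (y - x)
        y += dt * (x * (rho - z) - y)
        z += dt * (x * y - beta * z)
        out[i] = x
    return out

data = lorenz_series(3000)
data = (data - data.mean()) / data.std()

N = 50                              # small reservoir
W_in = rng.uniform(-1, 1, N)        # input weights
a = rng.uniform(0.2, 1.0, N)        # per-node leak rates

def step(r, u):
    # Uncoupled nodes: each node sees only the input u, never the other nodes,
    # so the internal coupling matrix is effectively diagonal.
    return (1 - a) * r + a * np.tanh(W_in * u)

# Open-loop training: drive the reservoir with data, collect states.
washout, T = 200, 2500
r = np.zeros(N)
states = np.empty((T, N))
for t in range(washout + T):
    r = step(r, data[t])
    if t >= washout:
        states[t - washout] = r
targets = data[washout + 1 : washout + T + 1]  # one-step-ahead targets

lam = 1e-6  # ridge regularization
W_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ targets)

# Closed-loop operation: feed the prediction back as the next input, yielding
# an autonomous surrogate system.
u = data[washout + T]
preds = []
for _ in range(100):
    r = step(r, u)
    u = W_out @ r
    preds.append(u)

# Jacobian of the closed-loop map r -> (1-a)*r + a*tanh(W_in * (W_out @ r)),
# evaluated at the final state; its spectral radius is the quantity the paper
# relates to stability of the reconstructed dynamics.
J = np.diag(1 - a) + np.outer(a * (1 - np.tanh(W_in * u) ** 2) * W_in, W_out)
print("surrogate spectral radius:", np.max(np.abs(np.linalg.eigvals(J))))
```

Because the tanh nonlinearity bounds every node state, the closed-loop trajectory cannot blow up even if the readout is poorly tuned; whether it actually reproduces the attractor is the question the paper studies via the spectral radius.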
Related papers
- Oscillations enhance time-series prediction in reservoir computing with feedback [3.3686252536891454]
Reservoir computing is a machine learning framework used for modeling the brain.
It is difficult to accurately reproduce the long-term target time series because the reservoir system becomes unstable.
This study proposes oscillation-driven reservoir computing (ODRC) with feedback.
arXiv Detail & Related papers (2024-06-05T02:30:29Z) - RefreshNet: Learning Multiscale Dynamics through Hierarchical Refreshing [0.0]
"refreshing" mechanism in RefreshNet allows coarser blocks to reset inputs of finer blocks, effectively controlling and alleviating error accumulation.
arXiv Detail & Related papers (2024-01-24T07:47:01Z) - Hybrid quantum-classical reservoir computing for simulating chaotic systems [2.4995929091995857]
This work presents a hybrid quantum reservoir-computing framework, which replaces the reservoir in RC with a quantum circuit.
The noiseless simulations of HQRC demonstrate valid prediction times comparable to state-of-the-art classical RC models.
arXiv Detail & Related papers (2023-11-23T17:07:02Z) - Long-term Wind Power Forecasting with Hierarchical Spatial-Temporal
Transformer [112.12271800369741]
Wind power is attracting increasing attention around the world due to its renewable and pollution-free nature, among other advantages.
Accurate wind power forecasting (WPF) can effectively reduce power fluctuations in power system operations.
Existing methods are mainly designed for short-term predictions and lack effective spatial-temporal feature augmentation.
arXiv Detail & Related papers (2023-05-30T04:03:15Z) - Effect of temporal resolution on the reproduction of chaotic dynamics
via reservoir computing [0.0]
Reservoir computing is a machine learning paradigm that uses a structure called a reservoir, which has nonlinearities and short-term memory.
This study analyzes the effect of sampling on the ability of reservoir computing to autonomously regenerate chaotic time series.
arXiv Detail & Related papers (2023-01-27T13:31:15Z) - Gated Recurrent Neural Networks with Weighted Time-Delay Feedback [59.125047512495456]
We introduce a novel gated recurrent unit (GRU) with a weighted time-delay feedback mechanism.
We show that $\tau$-GRU can converge faster and generalize better than state-of-the-art recurrent units and gated recurrent architectures.
arXiv Detail & Related papers (2022-12-01T02:26:34Z) - Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock prices involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA)
Our proposed model exhibits reduced computational complexity and performance comparable to or better than existing methods.
arXiv Detail & Related papers (2022-10-02T06:58:49Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Autoformer: Decomposition Transformers with Auto-Correlation for
Long-Term Series Forecasting [68.86835407617778]
Autoformer is a novel decomposition architecture with an Auto-Correlation mechanism.
In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a relative improvement on six benchmarks.
arXiv Detail & Related papers (2021-06-24T13:43:43Z) - Deep Cellular Recurrent Network for Efficient Analysis of Time-Series
Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while utilizing substantially less trainable parameters when compared to comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z) - Multioutput Gaussian Processes with Functional Data: A Study on Coastal
Flood Hazard Assessment [0.0]
We introduce a surrogate model that accounts for time-varying inputs and provides information on spatially varying inland flooding.
In several experiments, we demonstrate the versatility of the model for both learning maps and inferring unobserved maps.
We conclude that our framework is a promising approach for forecast and early-warning systems.
arXiv Detail & Related papers (2020-07-28T08:15:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.