Nonlinear Autoregression with Convergent Dynamics on Novel Computational
Platforms
- URL: http://arxiv.org/abs/2108.08001v1
- Date: Wed, 18 Aug 2021 07:01:16 GMT
- Title: Nonlinear Autoregression with Convergent Dynamics on Novel Computational
Platforms
- Authors: J. Chen and H. I. Nurdin
- Abstract summary: Reservoir computing exploits nonlinear dynamical systems for temporal information processing.
This paper introduces reservoir computers with output feedback as stationary and ergodic infinite-order nonlinear autoregressive models.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nonlinear stochastic modeling is useful for describing complex engineering
systems. Meanwhile, neuromorphic (brain-inspired) computing paradigms are
developing to tackle tasks that are challenging and resource intensive on
digital computers. An emerging scheme is reservoir computing which exploits
nonlinear dynamical systems for temporal information processing. This paper
introduces reservoir computers with output feedback as stationary and ergodic
infinite-order nonlinear autoregressive models. We highlight the versatility of
this approach by employing classical and quantum reservoir computers to model
synthetic and real data sets, further exploring their potential for control
applications.
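The core idea, a reservoir computer whose own output is fed back as input so that the trained system acts as a nonlinear autoregressive model, can be sketched as a minimal echo state network. The reservoir size, scalings, and ridge readout below are illustrative assumptions for the sketch, not the authors' exact construction.

```python
import numpy as np

# Minimal echo state network with output feedback, used as a nonlinear
# autoregressive model: the reservoir is driven by the (fed-back) output
# signal and a linear readout predicts the next value.
rng = np.random.default_rng(0)
N = 100
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (convergent dynamics)
w_fb = rng.uniform(-1.0, 1.0, size=N)            # output-feedback weights

def run_reservoir(y_seq):
    """Drive the reservoir with a scalar feedback signal; return the state trajectory."""
    x, states = np.zeros(N), []
    for y in y_seq:
        x = np.tanh(W @ x + w_fb * y)
        states.append(x.copy())
    return np.array(states)

# Teacher forcing on a toy series, then a ridge-regression readout.
T = 500
y = np.sin(0.1 * np.arange(T + 1))   # synthetic target series (assumption)
X = run_reservoir(y[:-1])            # states driven by past outputs
washout = 50                         # discard transient states
A, b = X[washout:], y[1 + washout:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b)
print("train RMSE:", np.sqrt(np.mean((A @ w_out - b) ** 2)))
```

At run time the trained readout's prediction would replace the teacher signal, closing the feedback loop so the model generates the series autoregressively.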
Related papers
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Extrapolating tipping points and simulating non-stationary dynamics of complex systems using efficient machine learning [2.44755919161855]
We propose a novel, fully data-driven machine learning algorithm based on next-generation reservoir computing to extrapolate the bifurcation behavior of nonlinear dynamical systems.
In doing so, post-tipping point dynamics of unseen parameter regions can be simulated.
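Next-generation reservoir computing replaces the random recurrent network with features built directly from time-delayed copies of the input and their polynomial products, fit with ridge regression. A minimal sketch, assuming a delay depth of 3 and degree-2 monomials (choices not specified by the entry above):

```python
import numpy as np
from itertools import combinations_with_replacement

def nvar_features(u, k=2):
    """Linear part: the last k samples; nonlinear part: their degree-2 monomials."""
    T = len(u) - k + 1
    lin = np.stack([u[i:i + T] for i in range(k)][::-1], axis=1)  # (T, k), most recent first
    quad = np.stack([lin[:, i] * lin[:, j]
                     for i, j in combinations_with_replacement(range(k), 2)], axis=1)
    return np.hstack([np.ones((T, 1)), lin, quad])  # constant + linear + quadratic terms

u = np.sin(0.2 * np.arange(400))        # toy input series (assumption)
Phi = nvar_features(u[:-1], k=3)        # features from past samples
target = u[3:]                          # one-step-ahead targets
w = np.linalg.solve(Phi.T @ Phi + 1e-8 * np.eye(Phi.shape[1]), Phi.T @ target)
print("one-step RMSE:", np.sqrt(np.mean((Phi @ w - target) ** 2)))
```

The function name `nvar_features` and the delay/order settings are illustrative; the appeal of the approach is that the feature map is deterministic and cheap, with no random reservoir to tune.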
arXiv Detail & Related papers (2023-12-11T10:37:28Z)
- Generative learning for nonlinear dynamics [7.6146285961466]
Generative machine learning models create realistic outputs far beyond their training data.
These successes suggest that generative models learn to effectively parametrize and sample arbitrarily complex distributions.
We aim to connect these classical works to emerging themes in large-scale generative statistical learning.
arXiv Detail & Related papers (2023-11-07T16:53:56Z)
- CoDBench: A Critical Evaluation of Data-driven Models for Continuous Dynamical Systems [8.410938527671341]
We introduce CoDBench, an exhaustive benchmarking suite comprising 11 state-of-the-art data-driven models for solving differential equations.
Specifically, we evaluate 4 distinct categories of models, viz., feed forward neural networks, deep operator regression models, frequency-based neural operators, and transformer architectures.
We conduct extensive experiments, assessing the operators' capabilities in learning, zero-shot super-resolution, data efficiency, robustness to noise, and computational efficiency.
arXiv Detail & Related papers (2023-10-02T21:27:54Z)
- Machine Learning with Chaotic Strange Attractors [0.0]
We present an analog computing method that harnesses chaotic nonlinear attractors to perform machine learning tasks with low power consumption.
Inspired by neuromorphic computing, our model is a programmable, versatile, and generalized platform for machine learning tasks.
When deployed as a simple analog device, it only requires milliwatt-scale power levels while being on par with current machine learning techniques.
arXiv Detail & Related papers (2023-09-23T12:54:38Z)
- Controlling dynamical systems to complex target states using machine learning: next-generation vs. classical reservoir computing [68.8204255655161]
Controlling nonlinear dynamical systems with machine learning makes it possible to drive systems not only into simple behavior such as periodicity, but also into more complex, arbitrary dynamics.
We show first that classical reservoir computing excels at this task.
In a next step, we compare those results based on different amounts of training data to an alternative setup, where next-generation reservoir computing is used instead.
It turns out that while next-generation RC delivers comparable performance for usual amounts of training data, it significantly outperforms classical RC in situations where only very limited data is available.
arXiv Detail & Related papers (2023-07-14T07:05:17Z)
- Training Deep Surrogate Models with Large Scale Online Learning [48.7576911714538]
Deep learning algorithms have emerged as a viable alternative for obtaining fast solutions for PDEs.
Models are usually trained on synthetic data generated by solvers, stored on disk, and read back for training.
This paper proposes an open-source online training framework for deep surrogate models.
arXiv Detail & Related papers (2023-06-28T12:02:27Z)
- Continual Learning of Dynamical Systems with Competitive Federated Reservoir Computing [29.98127520773633]
Continual learning aims to rapidly adapt to abrupt system changes without forgetting previously encountered dynamical regimes.
This work proposes an approach to continual learning based on reservoir computing.
We show that this multi-head reservoir minimizes interference and forgetting on several dynamical systems.
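The multi-head idea can be sketched as a shared random reservoir with one independently trained linear readout ("head") per dynamical regime, so that learning a new regime does not overwrite old ones. The head-selection rule used here (lowest one-step error) is a simplifying assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50
W = rng.normal(size=(N, N))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))  # contractive shared reservoir
w_in = rng.uniform(-1.0, 1.0, size=N)

def states(u):
    """Run the shared reservoir over an input series; return the state trajectory."""
    x, out = np.zeros(N), []
    for v in u:
        x = np.tanh(W @ x + w_in * v)
        out.append(x.copy())
    return np.array(out)

def fit_head(u):
    """Train one linear readout (head) on a single regime via ridge regression."""
    X = states(u[:-1])
    return np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ u[1:])

t = np.arange(300)
regimes = [np.sin(0.1 * t), np.sign(np.sin(0.05 * t))]  # two toy regimes
heads = [fit_head(u) for u in regimes]                  # one head per regime

def pick_head(u):
    """Select the head with the lowest one-step prediction error on u."""
    X = states(u[:-1])
    return int(np.argmin([np.mean((X @ h - u[1:]) ** 2) for h in heads]))
```

Heads are fit in isolation, so adding a regime never touches earlier readouts; the helper names (`fit_head`, `pick_head`) are illustrative, not from the paper.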
arXiv Detail & Related papers (2022-06-27T14:35:50Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data is noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.