A theoretical framework for reservoir computing on networks of organic electrochemical transistors
- URL: http://arxiv.org/abs/2408.09223v1
- Date: Sat, 17 Aug 2024 15:04:09 GMT
- Title: A theoretical framework for reservoir computing on networks of organic electrochemical transistors
- Authors: Nicholas W. Landry, Beckett R. Hyde, Jake C. Perez, Sean E. Shaheen, Juan G. Restrepo
- Abstract summary: Organic electrochemical transistors (OECTs) are physical devices with nonlinear transient properties.
We present a theoretical framework for simulating reservoir computers using OECTs as the nonlinear units.
We show that such an implementation can accurately predict the Lorenz attractor with comparable performance to standard reservoir computer implementations.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Efficient and accurate prediction of physical systems is important even when the rules of those systems cannot be easily learned. Reservoir computing, a type of recurrent neural network with fixed nonlinear units, is one such prediction method and is valued for its ease of training. Organic electrochemical transistors (OECTs) are physical devices with nonlinear transient properties that can be used as the nonlinear units of a reservoir computer. We present a theoretical framework for simulating reservoir computers using OECTs as the nonlinear units, serving as a test bed for designing physical reservoir computers. We present a proof of concept demonstrating that such an implementation can accurately predict the Lorenz attractor with performance comparable to standard reservoir computer implementations. We explore the effect of operating parameters and find that prediction performance depends strongly on the pinch-off voltage of the OECTs.
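To make the reservoir-computing setup concrete, here is a minimal Python sketch of an echo-state network forecasting the Lorenz attractor. A tanh nonlinearity stands in for the OECT response (the paper instead models the devices' transient dynamics and pinch-off voltage, which are not reproduced here), and all parameter values are illustrative.

```python
# Minimal echo-state-network sketch for Lorenz forecasting.
# The tanh unit is a stand-in for the OECT nonlinearity in the paper.
import numpy as np

rng = np.random.default_rng(0)

def lorenz(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with forward Euler (crude but simple)."""
    x = np.empty((n, 3))
    x[0] = (1.0, 1.0, 1.0)
    for t in range(n - 1):
        dx = np.array([sigma * (x[t, 1] - x[t, 0]),
                       x[t, 0] * (rho - x[t, 2]) - x[t, 1],
                       x[t, 0] * x[t, 1] - beta * x[t, 2]])
        x[t + 1] = x[t] + dt * dx
    return x

N, n_train = 400, 5000
data = lorenz(n_train + 501)
data = (data - data.mean(0)) / data.std(0)        # normalize each coordinate

W_in = rng.uniform(-0.5, 0.5, (N, 3))
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1

def step(r, u, leak=0.3):
    # tanh saturation loosely mimics a transistor's soft turn-off;
    # the paper integrates OECT transient dynamics at this step instead.
    return (1 - leak) * r + leak * np.tanh(W @ r + W_in @ u)

r, states = np.zeros(N), []
for u in data[:n_train]:
    r = step(r, u)
    states.append(r)
R, Y = np.array(states[100:]), data[101:n_train + 1]          # drop washout
W_out = np.linalg.solve(R.T @ R + 1e-6 * np.eye(N), R.T @ Y)  # ridge readout

u, preds = data[n_train], []          # closed loop: feed predictions back in
for _ in range(500):
    r = step(r, u)
    u = r @ W_out
    preds.append(u)
print("one-step prediction:", preds[0], "truth:", data[n_train + 1])
```

The same train-then-close-the-loop structure applies when the tanh units are replaced by simulated OECT dynamics; only the `step` function changes.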
Related papers
- Tight Stability, Convergence, and Robustness Bounds for Predictive Coding Networks [60.3634789164648]
Energy-based learning algorithms, such as predictive coding (PC), have garnered significant attention in the machine learning community.
We rigorously analyze the stability, robustness, and convergence of PC through the lens of dynamical systems theory.
arXiv Detail & Related papers (2024-10-07T02:57:26Z)
- Controlling dynamical systems to complex target states using machine learning: next-generation vs. classical reservoir computing [68.8204255655161]
Controlling nonlinear dynamical systems with machine learning makes it possible to drive them not only into simple behavior such as periodicity but also into more complex, arbitrary dynamics.
We first show that classical reservoir computing excels at this task.
We then compare these results, across different amounts of training data, to an alternative setup that uses next-generation reservoir computing instead.
While the two approaches perform comparably for typical amounts of training data, next-generation RC significantly outperforms classical RC when only very limited data are available.
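As an illustration of what distinguishes next-generation RC (NVAR), here is a sketch of its feature map: instead of a recurrent network, it builds features from time-delayed states and their low-order monomials, and only a linear readout is trained. The delay and polynomial order below are illustrative choices, not the paper's exact setup.

```python
# Minimal NVAR feature map: linear part is the current and one delayed
# state; nonlinear part is every unique quadratic monomial of the linear part.
import numpy as np
from itertools import combinations_with_replacement

def nvar_features(x, k=1):
    """x: (T, d) time series -> (T-k, n_feat) NVAR feature matrix."""
    lin = np.hstack([x[k:], x[:-k]])                 # x_t and x_{t-k}
    quad = np.column_stack([lin[:, i] * lin[:, j] for i, j in
                            combinations_with_replacement(range(lin.shape[1]), 2)])
    ones = np.ones((lin.shape[0], 1))
    return np.hstack([ones, lin, quad])

x = np.random.default_rng(1).normal(size=(100, 3))   # placeholder series
Phi = nvar_features(x)
print(Phi.shape)                                     # (99, 1 + 6 + 21)
```

A ridge-regressed linear readout on `Phi` then plays the role that the trained output layer plays in classical RC.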
arXiv Detail & Related papers (2023-07-14T07:05:17Z)
- Bose-Einstein condensate as nonlinear block of a Machine Learning pipeline [0.7695660509846216]
We show how to embed the nonlinear evolution of a quantum gas in a Machine Learning pipeline.
We demonstrate successful regression of a nonlinear function using a quasi-one-dimensional cloud of potassium atoms.
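The pipeline idea, a fixed physical nonlinearity sandwiched between trainable linear maps, can be sketched with a random-feature stand-in for the condensate; the cold-atom dynamics themselves are not modeled here.

```python
# Random-feature sketch of "fixed nonlinear block + trained linear readout".
# The tanh layer is a hypothetical stand-in for the condensate evolution.
import numpy as np

rng = np.random.default_rng(2)
X = np.linspace(-2, 2, 200)[:, None]
y = np.sin(3 * X[:, 0])                        # nonlinear target function

W_enc = rng.normal(size=(1, 64))               # encoding into the physical block
H = np.tanh(X @ W_enc)                         # stand-in for the quantum gas
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)  # trained linear readout
print("train MSE:", np.mean((H @ w_out - y) ** 2))
```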
arXiv Detail & Related papers (2023-04-28T15:26:18Z)
- ETLP: Event-based Three-factor Local Plasticity for online learning with neuromorphic hardware [105.54048699217668]
We show that Event-Based Three-factor Local Plasticity (ETLP) achieves competitive accuracy with a clear advantage in computational complexity.
We also show that, when using local plasticity, threshold adaptation in spiking neurons and a recurrent topology are necessary to learn temporal patterns with a rich temporal structure.
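A minimal sketch of the threshold-adaptation ingredient mentioned above: a leaky integrate-and-fire neuron whose firing threshold jumps after each spike and decays back to baseline. Constants are illustrative, not ETLP's.

```python
# Leaky integrate-and-fire neuron with an adaptive firing threshold.
import numpy as np

T, dt = 200, 1.0
v, th, th0 = 0.0, 1.0, 1.0
tau_v, tau_th, dth = 20.0, 80.0, 0.5
rng = np.random.default_rng(3)
spikes = []
for t in range(T):
    i_in = 0.12 + 0.05 * rng.standard_normal()   # noisy input current
    v += dt * (-v / tau_v + i_in)                # leaky integration
    th += dt * (th0 - th) / tau_th               # threshold decays to baseline
    if v >= th:                                  # spike: reset, raise threshold
        spikes.append(t)
        v = 0.0
        th += dth
print("spike times:", spikes)
```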
arXiv Detail & Related papers (2023-01-19T19:45:42Z)
- Physics-informed machine learning with differentiable programming for heterogeneous underground reservoir pressure management [64.17887333976593]
Avoiding over-pressurization in subsurface reservoirs is critical for applications like CO2 sequestration and wastewater injection.
Managing these pressures by controlling injection and extraction is challenging because of the complex heterogeneity of the subsurface.
We use differentiable programming with a full-physics model and machine learning to determine the fluid extraction rates that prevent over-pressurization.
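The optimization loop can be sketched with a toy surrogate in which pressures respond linearly to extraction rates; the paper differentiates through a full-physics model instead, but the structure, descending the gradient of an over-pressure penalty with respect to the rates, is the same. Everything below is a hypothetical stand-in.

```python
# Toy gradient-based tuning of extraction rates against over-pressure.
# Linear pressure response and hand-derived gradient; illustrative only.
import numpy as np

rng = np.random.default_rng(4)
n_wells, p_max = 5, 1.0
A = -np.abs(rng.normal(0.3, 0.1, (n_wells, n_wells)))  # extraction lowers pressure
p0 = rng.uniform(1.1, 1.4, n_wells)                    # pressures with no extraction
q = np.zeros(n_wells)                                  # extraction rates to optimize

for _ in range(500):
    p = p0 + A @ q
    over = np.maximum(p - p_max, 0.0)                  # over-pressure violations
    grad = 2 * A.T @ over + 0.1 * np.sign(q)           # penalty + extraction cost
    q = np.maximum(q - 0.05 * grad, 0.0)               # rates stay nonnegative
print("max pressure:", (p0 + A @ q).max(), "rates:", q.round(3))
```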
arXiv Detail & Related papers (2022-06-21T20:38:13Z)
- Optimal reservoir computers for forecasting systems of nonlinear dynamics [0.0]
We show that reservoirs of low connectivity perform better than or as well as those of high connectivity in forecasting noiseless Lorenz and coupled Wilson-Cowan systems.
We also show that, unexpectedly, computationally effective reservoirs of unconnected nodes (RUN) outperform reservoirs of linked network topologies in predicting these systems.
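The connectivity comparison is easy to set up: a reservoir of unconnected nodes (RUN) is simply the zero-connectivity limit, in which each node is driven by the input alone. A sketch:

```python
# Random reservoir matrices at varying connectivity; c = 0 gives a RUN.
import numpy as np

def make_reservoir(n, connectivity, rho=0.9, rng=np.random.default_rng(5)):
    """Random reservoir matrix with a given fraction of nonzero links."""
    W = rng.normal(size=(n, n)) * (rng.random((n, n)) < connectivity)
    radius = np.max(np.abs(np.linalg.eigvals(W)))
    return W * (rho / radius) if radius > 0 else W   # RUN case: W stays zero

for c in (0.0, 0.01, 0.1):
    W = make_reservoir(200, c)
    print(f"connectivity {c}: nonzeros = {np.count_nonzero(W)}")
```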
arXiv Detail & Related papers (2022-02-09T09:36:31Z)
- Physical reservoir computing using finitely-sampled quantum systems [0.0]
Reservoir computing exploits the nonlinear dynamics of a physical reservoir to perform complex time-series processing tasks.
Here we describe a framework for reservoir computing with nonlinear quantum reservoirs under continuous measurement.
arXiv Detail & Related papers (2021-10-26T16:46:14Z)
- The Computational Capacity of LRC, Memristive and Hybrid Reservoirs [1.657441317977376]
Reservoir computing is a machine learning paradigm that uses a high-dimensional dynamical system, or "reservoir", to approximate and predict time series data.
We analyze the feasibility and optimal design of electronic reservoirs that include both linear elements (resistors, inductors, and capacitors) and nonlinear memory elements called memristors.
Our electronic reservoirs can match or exceed the performance of conventional "echo state network" reservoirs in a form that may be directly implemented in hardware.
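As a flavor of the nonlinear memory element involved, here is a minimal HP-style memristor update: a state-dependent resistance driven by the applied voltage. Parameter values are illustrative only, not the paper's circuit models.

```python
# One memristor driven by a sinusoidal voltage; internal state w moves
# with charge, producing a history-dependent (memory) resistance.
import numpy as np

def memristor_response(v_sig, dt=1e-3, R_on=100.0, R_off=16e3, mu=1e4):
    """Drive one memristor with voltage v_sig; return the current trace."""
    w, i_out = 0.5, []
    for v in v_sig:
        R = R_on * w + R_off * (1.0 - w)         # state-dependent resistance
        i = v / R
        w = np.clip(w + dt * mu * i, 0.0, 1.0)   # current moves internal state
        i_out.append(i)
    return np.array(i_out)

t = np.linspace(0, 1, 1000)
i = memristor_response(np.sin(2 * np.pi * 5 * t))
print("current range:", i.min(), i.max())
```

Collecting the state or current traces of many such elements, then training a linear readout on them, turns the circuit into a reservoir in the same sense as the echo-state sketch above.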
arXiv Detail & Related papers (2020-08-31T21:24:45Z)
- OrbNet: Deep Learning for Quantum Chemistry Using Symmetry-Adapted Atomic-Orbital Features [42.96944345045462]
OrbNet is shown to outperform existing methods in terms of learning efficiency and transferability.
For applications to datasets of drug-like molecules, OrbNet predicts energies within chemical accuracy of DFT at a computational cost reduced a thousand-fold or more.
arXiv Detail & Related papers (2020-07-15T22:38:41Z)
- Predictive Coding Approximates Backprop along Arbitrary Computation Graphs [68.8204255655161]
We develop a strategy to translate core machine learning architectures into their predictive coding equivalents.
Our models perform equivalently to backprop on challenging machine learning benchmarks.
Our method raises the potential that standard machine learning algorithms could in principle be directly implemented in neural circuitry.
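The translation rests on predictive coding's local update rules. Below is a minimal sketch for one hidden layer, assuming a simple Gaussian energy: activities settle by local error minimization, then weights update from purely local errors, approximating the backprop gradient.

```python
# Predictive-coding inference and learning for one hidden layer.
import numpy as np

rng = np.random.default_rng(6)
f, df = np.tanh, lambda x: 1 - np.tanh(x) ** 2
W1, W2 = rng.normal(0, 0.5, (8, 4)), rng.normal(0, 0.5, (2, 8))
x0, target = rng.normal(size=4), rng.normal(size=2)

x1 = W1 @ f(x0)                 # initialize at the feedforward pass
x2 = target                     # clamp the output layer to the label
for _ in range(50):             # inference: relax the hidden activity
    e1 = x1 - W1 @ f(x0)        # local prediction errors
    e2 = x2 - W2 @ f(x1)
    x1 += 0.1 * (-e1 + df(x1) * (W2.T @ e2))
# learning: purely local Hebbian-like weight updates
W1 += 0.01 * np.outer(e1, f(x0))
W2 += 0.01 * np.outer(e2, f(x1))
print("output error after settling:", np.linalg.norm(e2))
```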
arXiv Detail & Related papers (2020-06-07T15:35:47Z)
- Training End-to-End Analog Neural Networks with Equilibrium Propagation [64.0476282000118]
We introduce a principled method to train end-to-end analog neural networks by gradient descent.
We show mathematically that a class of analog neural networks (called nonlinear resistive networks) are energy-based models.
Our work can guide the development of a new generation of ultra-fast, compact and low-power neural networks supporting on-chip learning.
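Equilibrium propagation itself can be shown on a one-weight toy energy model: two settling phases, free and weakly nudged, yield a gradient estimate from purely local quantities. This is a sketch of the general algorithm, not the paper's resistive-network formulation.

```python
# Equilibrium propagation on E(s) = s^2/2 - s*w*x with cost C(s) = (s-y)^2/2.
import numpy as np

w, beta, lr = 0.2, 0.01, 0.2
x, y = 1.5, 0.9                       # one training pair

def settle(w, x, y, beta, steps=200, eta=0.1):
    """Gradient descent on the total energy E + beta*C until equilibrium."""
    s = 0.0
    for _ in range(steps):
        dE = s - w * x                # dE/ds
        dC = beta * (s - y)           # beta * dC/ds
        s -= eta * (dE + dC)
    return s

for epoch in range(100):
    s_free = settle(w, x, y, 0.0)     # free phase
    s_nudge = settle(w, x, y, beta)   # weakly nudged phase
    # dE/dw = -s*x, so the EqProp update is (1/beta)*(s_nudge - s_free)*x
    w += lr * (s_nudge - s_free) * x / beta
print("prediction:", w * x, "target:", y)
```

The two-phase contrast replaces backprop entirely, which is what makes the scheme attractive for physical (e.g. analog resistive) hardware.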
arXiv Detail & Related papers (2020-06-02T23:38:35Z)