Harnessing Synthetic Active Particles for Physical Reservoir Computing
- URL: http://arxiv.org/abs/2307.15010v1
- Date: Thu, 27 Jul 2023 17:08:53 GMT
- Title: Harnessing Synthetic Active Particles for Physical Reservoir Computing
- Authors: Xiangzun Wang, Frank Cichos
- Abstract summary: Reservoir computing is a technique in which stimulating a network of nodes with fading memory enables computations and complex predictions.
Here we demonstrate physical reservoir computing with a synthetic active microparticle system that self-organizes from an active and passive component into inherently noisy nonlinear dynamical units.
Our results pave the way for the study of information processing in synthetic self-organized active particle systems.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The processing of information is an indispensable property of living systems
realized by networks of active processes with enormous complexity. They have
inspired many variants of modern machine learning, one of them being reservoir
computing, in which stimulating a network of nodes with fading memory enables
computations and complex predictions. Reservoirs are implemented on computer
hardware, but also on unconventional physical substrates such as mechanical
oscillators, spins, or bacteria, often summarized as physical reservoir
computing. Here we demonstrate physical reservoir computing with a synthetic
active microparticle system that self-organizes from an active and passive
component into inherently noisy nonlinear dynamical units. The
self-organization and dynamical response of the unit are the result of a delayed
propulsion of the microswimmer to a passive target. A reservoir of such units
with a self-coupling via the delayed response can perform predictive tasks
despite the strong noise resulting from Brownian motion of the microswimmers.
To achieve efficient noise suppression, we introduce a special architecture
that uses historical reservoir states for output. Our results pave the way for
the study of information processing in synthetic self-organized active particle
systems.
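As a rough illustration of the scheme described above, the following sketch runs a generic echo-state-style reservoir of noisy nonlinear nodes and trains a linear readout on a window of historical reservoir states, mimicking the idea of using past states to suppress node noise. This is a minimal sketch under stated assumptions, not the authors' implementation: the delayed-propulsion microswimmer units are replaced by tanh nodes, and the reservoir size, noise level, history depth, and delayed-recall task are all illustrative.

```python
# Minimal sketch (not the authors' code): noisy echo-state reservoir with a
# linear readout trained on a window of historical reservoir states.
import numpy as np

rng = np.random.default_rng(0)

N, H = 50, 10           # reservoir size, number of historical states in the readout
T_train, T_test = 2000, 500
noise = 0.2             # stands in for the Brownian noise of the microswimmers

# Input: a scalar drive; task: recall the input delayed by 5 steps
u = rng.uniform(-1, 1, T_train + T_test)
target = np.roll(u, 5)

W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 -> fading memory

# Run the reservoir and collect the states
x = np.zeros(N)
states = []
for t in range(T_train + T_test):
    x = np.tanh(W @ x + W_in * u[t]) + noise * rng.normal(size=N)
    states.append(x.copy())
states = np.array(states)

# Feature vector at time t: concatenation of the last H reservoir states
feats = np.hstack([np.roll(states, h, axis=0) for h in range(H)])
feats = feats[H:]                      # drop rows with wrapped-around history
y = target[H:]

# Ridge-regression readout trained on the first part, evaluated on the rest
split = T_train - H
A, b = feats[:split], y[:split]
W_out = np.linalg.solve(A.T @ A + 1e-4 * np.eye(A.shape[1]), A.T @ b)
pred = feats[split:] @ W_out
print("test NMSE:", np.mean((pred - y[split:]) ** 2) / np.var(y[split:]))
```

Increasing the history depth H effectively averages the noisy node responses over time, which is the intuition behind a readout that uses historical reservoir states.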
Related papers
- Resistive Memory-based Neural Differential Equation Solver for Score-based Diffusion Model [55.116403765330084]
Current AIGC methods, such as score-based diffusion, are still deficient in terms of speed and efficiency.
We propose a time-continuous and analog in-memory neural differential equation solver for score-based diffusion.
We experimentally validate our solution with 180 nm resistive memory in-memory computing macros.
arXiv Detail & Related papers (2024-04-08T16:34:35Z)
- Deep Photonic Reservoir Computer for Speech Recognition [49.1574468325115]
Speech recognition is a critical task in the field of artificial intelligence and has witnessed remarkable advancements.
Deep reservoir computing is energy efficient but exhibits limitations in performance when compared to more resource-intensive machine learning algorithms.
We propose a photonic-based deep reservoir computer and evaluate its effectiveness on different speech recognition tasks.
arXiv Detail & Related papers (2023-12-11T17:43:58Z)
- Controlling dynamical systems to complex target states using machine learning: next-generation vs. classical reservoir computing [68.8204255655161]
Controlling nonlinear dynamical systems with machine learning makes it possible to drive them not only into simple behavior such as periodicity but also into more complex, arbitrary dynamics.
We show first that classical reservoir computing excels at this task.
We then compare those results, obtained with different amounts of training data, to an alternative setup in which next-generation reservoir computing is used instead (a minimal sketch of that feature construction follows this list).
It turns out that while both approaches deliver comparable performance for usual amounts of training data, next-generation RC significantly outperforms classical reservoir computing when only very limited data is available.
arXiv Detail & Related papers (2023-07-14T07:05:17Z)
- Machine learning at the mesoscale: a computation-dissipation bottleneck [77.34726150561087]
We study a computation-dissipation bottleneck in mesoscopic systems used as input-output devices.
Our framework sheds light on a crucial compromise between information compression, input-output computation and dynamic irreversibility induced by non-reciprocal interactions.
arXiv Detail & Related papers (2023-07-05T15:46:07Z)
- Task Agnostic Metrics for Reservoir Computing [0.0]
Physical reservoir computing is a computational paradigm that enables temporal pattern recognition in physical matter.
The chosen dynamical system must have three desirable properties: non-linearity, complexity, and fading memory.
We show that, in general, systems with lower damping reach higher values in all three performance metrics.
arXiv Detail & Related papers (2021-08-03T13:58:11Z)
- Natural quantum reservoir computing for temporal information processing [4.785845498722406]
Reservoir computing is a temporal information processing system that exploits artificial or physical dissipative dynamics.
This paper proposes the use of real superconducting quantum computing devices as the reservoir, where the dissipative property is provided by the natural noise acting on the quantum bits.
arXiv Detail & Related papers (2021-07-13T01:58:57Z)
- Reservoir Stack Machines [77.12475691708838]
Memory-augmented neural networks equip a recurrent neural network with an explicit memory to support tasks that require information storage.
We introduce the reservoir stack machine, a model which can provably recognize all deterministic context-free languages.
Our results show that the reservoir stack machine achieves zero error, even on test sequences longer than the training data.
arXiv Detail & Related papers (2021-05-04T16:50:40Z)
- Reservoir Computing with Magnetic Thin Films [35.32223849309764]
New unconventional computing hardware has emerged with the potential to exploit natural phenomena and gain efficiency.
Physical reservoir computing demonstrates this with a variety of unconventional systems.
We perform an initial exploration of three magnetic materials in thin-film geometries via microscale simulation.
arXiv Detail & Related papers (2021-01-29T17:37:17Z)
- Reservoir Memory Machines as Neural Computers [70.5993855765376]
Differentiable neural computers extend artificial neural networks with an explicit memory without interference.
We achieve some of the computational capabilities of differentiable neural computers with a model that can be trained very efficiently.
arXiv Detail & Related papers (2020-09-14T12:01:30Z)
- Building Reservoir Computing Hardware Using Low Energy-Barrier Magnetics [0.0]
Biologically inspired recurrent neural networks, such as reservoir computers, are of interest from a hardware point of view due to their simple learning scheme and deep connections to Kalman filters.
Implementing reservoir computers with such devices may enable compact, energy-efficient signal processors for standalone or in-situ machine cognition in edge devices.
arXiv Detail & Related papers (2020-07-06T14:11:45Z)
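For the entry above on next-generation vs. classical reservoir computing, the following sketch shows the core of the next-generation approach as it is commonly described: the feature vector is built directly from time-delayed copies of the input plus their quadratic products, and only a linear readout is fit. The logistic-map data, delay depth, and ridge strength are illustrative assumptions, not the setup of the cited paper.

```python
# Minimal sketch of a next-generation reservoir computer (NG-RC):
# time-delayed inputs + quadratic products + linear ridge-regression readout.
import numpy as np
from itertools import combinations_with_replacement

# Generate a chaotic logistic-map time series as toy data
T = 3000
s = np.empty(T)
s[0] = 0.3
for t in range(T - 1):
    s[t + 1] = 3.9 * s[t] * (1 - s[t])

k = 3                                          # number of delay taps
lin = np.column_stack([s[k - 1 - d:T - 1 - d] for d in range(k)])   # [s_t, s_{t-1}, s_{t-2}]
quad = np.column_stack([lin[:, i] * lin[:, j]
                        for i, j in combinations_with_replacement(range(k), 2)])
X = np.hstack([np.ones((lin.shape[0], 1)), lin, quad])   # constant + linear + quadratic
y = s[k:]                                      # one-step-ahead target

# Ridge-regression readout: train on the first 2000 samples, test on the rest
split = 2000
W = np.linalg.solve(X[:split].T @ X[:split] + 1e-6 * np.eye(X.shape[1]),
                    X[:split].T @ y[:split])
pred = X[split:] @ W
print("test NMSE:", np.mean((pred - y[split:]) ** 2) / np.var(y[split:]))
```

Because the feature map is explicit and small, the only free parameters are the readout weights, which is why this kind of approach can remain data-efficient when training data are scarce.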