Seeing double with a multifunctional reservoir computer
- URL: http://arxiv.org/abs/2305.05799v2
- Date: Thu, 19 Oct 2023 09:16:41 GMT
- Title: Seeing double with a multifunctional reservoir computer
- Authors: Andrew Flynn, Vassilios A. Tsachouridis, Andreas Amann
- Abstract summary: Multifunctional biological neural networks exploit multistability in order to perform multiple tasks without changing any network properties.
We study how a reservoir computer reconstructs a coexistence of attractors when there is an overlap between them.
A bifurcation analysis reveals how multifunctionality emerges and is destroyed as the RC enters a chaotic regime.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multifunctional biological neural networks exploit multistability in order to
perform multiple tasks without changing any network properties. Enabling
artificial neural networks (ANNs) to obtain certain multistabilities in order
to perform several tasks, where each task is related to a particular attractor
in the network's state space, naturally has many benefits from a machine
learning perspective. Given the association to multistability, in this paper we
explore how the relationship between different attractors influences the
ability of a reservoir computer (RC), which is a dynamical system in the form
of an ANN, to achieve multifunctionality. We construct the 'seeing double'
problem to systematically study how an RC reconstructs a coexistence of
attractors when there is an overlap between them. As the amount of overlap
increases, we discover that for multifunctionality to occur, there is a
critical dependence on a suitable choice of the spectral radius for the RC's
internal network connections. A bifurcation analysis reveals how
multifunctionality emerges and is destroyed as the RC enters a chaotic regime
that can lead to chaotic itinerancy.
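A minimal sketch may help make the spectral-radius dependence concrete. The snippet below builds a toy echo state network and rescales its internal connection matrix to a chosen spectral radius before driving it with input; the reservoir size, input dimension, and spectral-radius value are illustrative assumptions, not the paper's actual settings.

```python
# Toy echo state network: the internal weight matrix W is rescaled so its
# largest |eigenvalue| equals a target spectral radius rho, the parameter
# whose choice the abstract identifies as critical for multifunctionality.
import numpy as np

rng = np.random.default_rng(0)

N = 100      # reservoir size (assumed)
rho = 0.9    # target spectral radius (assumed value)

# Random internal network, rescaled to spectral radius rho.
W = rng.normal(size=(N, N))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))

W_in = rng.uniform(-1, 1, size=(N, 3))  # input weights for a 3-D drive (assumed)

def step(r, u):
    """One reservoir update: r(t+1) = tanh(W r(t) + W_in u(t))."""
    return np.tanh(W @ r + W_in @ u)

# Drive the reservoir with a short random input sequence.
r = np.zeros(N)
for t in range(50):
    r = step(r, rng.normal(size=3))

print(np.max(np.abs(np.linalg.eigvals(W))))  # ≈ rho after rescaling
```

Training for multifunctionality would then fit a single linear readout on reservoir states collected from each attractor's training signal; only the rescaling step is shown here.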
Related papers
- Renormalized Connection for Scale-preferred Object Detection in Satellite Imagery [51.83786195178233]
We design a Knowledge Discovery Network (KDN) to implement the renormalization group theory in terms of efficient feature extraction.
Renormalized connection (RC) on the KDN enables 'synergistic focusing' of multi-scale features.
RCs extend the multi-level feature's 'divide-and-conquer' mechanism of the FPN-based detectors to a wide range of scale-preferred tasks.
arXiv Detail & Related papers (2024-09-09T13:56:22Z) - Exploring the origins of switching dynamics in a multifunctional reservoir computer [0.0]
Reservoir computers (RCs) reconstruct multiple attractors simultaneously using the same set of trained weights.
In certain cases, if the RC fails to reconstruct a coexistence of attractors then it exhibits a form of metastability.
This paper explores the origins of these switching dynamics in a paradigmatic setting via the 'seeing double' problem.
arXiv Detail & Related papers (2024-08-27T20:51:48Z) - Leveraging Low-Rank and Sparse Recurrent Connectivity for Robust
Closed-Loop Control [63.310780486820796]
We show how a parameterization of recurrent connectivity influences robustness in closed-loop settings.
We find that closed-form continuous-time neural networks (CfCs) with fewer parameters can outperform their full-rank, fully-connected counterparts.
arXiv Detail & Related papers (2023-10-05T21:44:18Z) - RObotic MAnipulation Network (ROMAN) – Hybrid
Hierarchical Learning for Solving Complex Sequential Tasks [70.69063219750952]
We present a Hybrid Hierarchical Learning framework, the Robotic Manipulation Network (ROMAN)
ROMAN achieves task versatility and robust failure recovery by integrating behavioural cloning, imitation learning, and reinforcement learning.
Experimental results show that by orchestrating and activating these specialised manipulation experts, ROMAN generates correct sequential activations for accomplishing long sequences of sophisticated manipulation tasks.
arXiv Detail & Related papers (2023-06-30T20:35:22Z) - Multifunctionality in a Connectome-Based Reservoir Computer [0.0]
The 'fruit fly RC' (FFRC) exhibits multifunctionality using the 'seeing double' problem as a benchmark test.
Compared to the widely-used Erdős-Rényi Reservoir Computer (ERRC), we report that the FFRC exhibits a greater capacity for multifunctionality.
arXiv Detail & Related papers (2023-06-02T19:37:38Z) - Exploring the limits of multifunctionality across different reservoir
computers [0.0]
We explore the performance of the continuous-time, leaky-integrator, and 'next-generation' reservoir computer (RC).
We train each RC to reconstruct a coexistence of chaotic attractors from different dynamical systems.
We examine the critical effects that certain parameters can have in each RC to achieve multifunctionality.
arXiv Detail & Related papers (2022-05-23T15:06:38Z) - Disentangling Spatial-Temporal Functional Brain Networks via
Twin-Transformers [12.137308815848717]
How to identify and characterize functional brain networks (BN) is fundamental to gain system-level insights into the mechanisms of brain organization architecture.
We propose a novel Twin-Transformers framework to simultaneously infer common and individual functional networks in both spatial and temporal space.
arXiv Detail & Related papers (2022-04-20T04:57:53Z) - Learning distinct features helps, provably [98.78384185493624]
We study the diversity of the features learned by a two-layer neural network trained with the least squares loss.
We measure the diversity by the average $L_2$-distance between the hidden-layer features.
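The diversity measure just described can be sketched in a few lines, assuming the distance in question is the Euclidean ($L_2$) norm and using made-up toy feature vectors; only the averaged-pairwise-distance formula follows the abstract.

```python
# Average pairwise L2-distance between hidden-layer feature vectors,
# computed on three toy 2-D features (values chosen for illustration).
import numpy as np
from itertools import combinations

features = np.array([[0.0, 0.0],
                     [3.0, 4.0],
                     [0.0, 0.0]])  # one row per hidden unit (toy data)

pairs = combinations(range(len(features)), 2)
diversity = np.mean([np.linalg.norm(features[i] - features[j])
                     for i, j in pairs])
print(diversity)  # mean of the pairwise distances 5.0, 0.0, 5.0
```

A diversity of zero would mean all hidden units learned identical features; the claim in the paper is that larger values of this quantity help, provably.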
arXiv Detail & Related papers (2021-06-10T19:14:45Z) - Phase Configuration Learning in Wireless Networks with Multiple
Reconfigurable Intelligent Surfaces [50.622375361505824]
Reconfigurable Intelligent Surfaces (RISs) are a highly scalable technology capable of offering dynamic control of electromagnetic wave propagation.
One of the major challenges with RIS-empowered wireless communications is the low-overhead dynamic configuration of multiple RISs.
We devise low-complexity supervised learning approaches for the RISs' phase configurations.
arXiv Detail & Related papers (2020-10-09T05:35:27Z) - Automated Search for Resource-Efficient Branched Multi-Task Networks [81.48051635183916]
We propose a principled approach, rooted in differentiable neural architecture search, to automatically define branching structures in a multi-task neural network.
We show that our approach consistently finds high-performing branching structures within limited resource budgets.
arXiv Detail & Related papers (2020-08-24T09:49:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.