Biological neurons act as generalization filters in reservoir computing
- URL: http://arxiv.org/abs/2210.02913v1
- Date: Thu, 6 Oct 2022 13:32:26 GMT
- Title: Biological neurons act as generalization filters in reservoir computing
- Authors: Takuma Sumi, Hideaki Yamamoto, Yuichi Katori, Satoshi Moriya, Tomohiro
Konno, Shigeo Sato, Ayumi Hirano-Iwata
- Abstract summary: Reservoir computing is a machine learning paradigm that transforms the transient dynamics of high-dimensional nonlinear systems for processing time-series data.
Here, we use optogenetics and fluorescent calcium imaging to record the multicellular responses of cultured biological neuronal networks (BNNs).
We show that modular BNNs can be used to classify static input patterns with a linear decoder and that the modularity of the BNNs positively correlates with the classification accuracy.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reservoir computing is a machine learning paradigm that transforms the
transient dynamics of high-dimensional nonlinear systems for processing
time-series data. Although reservoir computing was initially proposed to model
information processing in the mammalian cortex, it remains unclear how the
non-random network architecture, such as the modular architecture, in the
cortex integrates with the biophysics of living neurons to characterize the
function of biological neuronal networks (BNNs). Here, we used optogenetics and
fluorescent calcium imaging to record the multicellular responses of cultured
BNNs and employed the reservoir computing framework to decode their
computational capabilities. Micropatterned substrates were used to embed the
modular architecture in the BNNs. We first show that modular BNNs can be used
to classify static input patterns with a linear decoder and that the modularity
of the BNNs positively correlates with the classification accuracy. We then
used a timer task to verify that BNNs possess a short-term memory of ~1 s and
finally show that this property can be exploited for spoken digit
classification. Interestingly, BNN-based reservoirs allow transfer learning,
wherein a network trained on one dataset can be used to classify separate
datasets of the same category. Such classification was not possible when the
input patterns were directly decoded by a linear decoder, suggesting that BNNs
act as a generalization filter to improve reservoir computing performance. Our
findings pave the way toward a mechanistic understanding of information
processing within BNNs and, simultaneously, build future expectations toward
the realization of physical reservoir computing systems based on BNNs.
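Although the paper's reservoir is living tissue, the decoding side of the framework is the standard one: record the reservoir's high-dimensional response to each input and train only a linear readout on top. Below is a minimal sketch of that pipeline with a simulated echo state network standing in for the BNN; the network sizes, the `tanh` nonlinearity, and the ridge readout are illustrative assumptions, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_res, n_classes = 8, 200, 4           # illustrative sizes, not the BNN's

# Fixed random reservoir: only the linear readout below is ever trained.
W_in = rng.uniform(-1, 1, (n_res, n_in))
W_res = rng.normal(0, 1, (n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # spectral radius < 1

def reservoir_states(u_seq):
    """Drive the reservoir with an input sequence; return the final state."""
    x = np.zeros(n_res)
    for u in u_seq:
        x = np.tanh(W_in @ u + W_res @ x)    # transient nonlinear dynamics
    return x

def train_readout(X, Y, ridge=1e-3):
    """Fit a ridge-regression linear decoder on collected reservoir states."""
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

# Toy usage: classify static patterns, each presented for 20 time steps.
patterns = rng.uniform(-1, 1, (n_classes, n_in))
X = np.stack([reservoir_states(np.tile(p, (20, 1))) for p in patterns])
W_out = train_readout(X, np.eye(n_classes))  # one-hot targets
pred = np.argmax(X @ W_out, axis=1)          # should recover classes 0..3
```

In this framing, the reservoir weights are never updated, so transfer learning amounts to reusing the fixed reservoir with a decoder fit elsewhere; in the paper, the BNN plays the role of the fixed reservoir.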
Related papers
- Integration of Contrastive Predictive Coding and Spiking Neural Networks [0.0]
This study examines the integration of Contrastive Predictive Coding (CPC) with Spiking Neural Networks (SNNs). The goal is to develop a predictive coding model with greater biological plausibility by processing inputs and outputs in a spike-based system. The study demonstrates that CPC can be effectively combined with SNNs, showing that an SNN trained for classification tasks can also function as an encoding mechanism.
arXiv Detail & Related papers (2025-06-10T19:23:08Z)
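For context on the summarized method: CPC trains an encoder so that a context embedding scores the true future segment above negative samples, via the InfoNCE objective. A minimal NumPy sketch of that objective (the spike-based encoder itself is not reproduced here; shapes and names are illustrative):

```python
import numpy as np

def info_nce_loss(context, candidates, pos_idx):
    """InfoNCE: negative log-softmax score of the true future among candidates.

    context:    (d,)   context embedding produced by the encoder
    candidates: (K, d) one positive future embedding plus K-1 negatives
    pos_idx:    index of the positive in `candidates`
    """
    scores = candidates @ context          # similarity of each candidate
    scores -= scores.max()                 # numerical stability
    return -(scores[pos_idx] - np.log(np.exp(scores).sum()))

# Toy usage with random embeddings.
rng = np.random.default_rng(0)
c, cands = rng.normal(size=16), rng.normal(size=(8, 16))
loss = info_nce_loss(c, cands, pos_idx=0)
```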
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the computational time and space complexities with respect to the sequence length from cubic and quadratic, respectively, to linear.
Extensive experiments demonstrate that the resulting Scalable Mechanistic Neural Network (S-MNN) matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Use of Parallel Explanatory Models to Enhance Transparency of Neural Network Configurations for Cell Degradation Detection [18.214293024118145]
We build a parallel model to illuminate and understand the internal operation of neural networks.
We show how each layer of the recurrent neural network (RNN) transforms the input distributions to increase detection accuracy.
At the same time, we discover a side effect that acts to limit the improvement in accuracy.
arXiv Detail & Related papers (2024-04-17T12:22:54Z)
- An exact mathematical description of computation with transient spatiotemporal dynamics in a complex-valued neural network [33.7054351451505]
We study a complex-valued neural network (cv-NN) with linear time-delayed interactions.
The cv-NN displays sophisticated dynamics, including adaptable, partially synchronized "chimera" states.
We demonstrate that computations in the cv-NN are decodable by living biological neurons.
arXiv Detail & Related papers (2023-11-28T02:23:30Z)
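The model class in this summary is compact enough to sketch: each unit carries a complex amplitude, and units interact linearly through time-delayed couplings. The sketch below uses a random coupling matrix, a single uniform delay, and unit-circle normalization, all of which are illustrative assumptions rather than the paper's exact equations:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, tau = 64, 400, 10                      # units, steps, assumed uniform delay

# Illustrative random complex coupling matrix.
W = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(N)

z = np.zeros((T, N), dtype=complex)
z[: tau + 1] = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(tau + 1, N)))

for t in range(tau, T - 1):
    drive = W @ z[t - tau]                   # linear time-delayed interaction
    z[t + 1] = drive / (np.abs(drive) + 1e-12)  # keep amplitudes bounded

phases = np.angle(z)  # spatiotemporal phase pattern that carries the computation
```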
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
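snnTorch is a public Python package, so a minimal CPU/GPU usage example is easy to give; the IPU-specific optimizations reported in the paper are not shown here, and the layer sizes are arbitrary:

```python
import torch
import snntorch as snn

fc = torch.nn.Linear(784, 10)      # synaptic weights feeding the spiking layer
lif = snn.Leaky(beta=0.9)          # leaky integrate-and-fire neuron layer

mem = lif.init_leaky()             # initialize membrane potential
x = torch.rand(100, 784)           # 100 time steps of input

spk_rec = []
for step in range(x.shape[0]):
    cur = fc(x[step])              # synaptic current at this time step
    spk, mem = lif(cur, mem)       # spike output and updated membrane state
    spk_rec.append(spk)

spikes = torch.stack(spk_rec)      # (time, 10) output spike trains
```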
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- A Temporal Neural Network Architecture for Online Learning [0.6091702876917281]
Temporal neural networks (TNNs) communicate and process information encoded as relative spike times.
A TNN architecture is proposed and, as a proof of concept, TNN operation is demonstrated within the larger context of online supervised classification.
arXiv Detail & Related papers (2020-11-27T17:15:29Z)
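The core idea, information in relative spike times, can be sketched in a few lines: larger input values fire earlier, and a race-based readout picks the first unit to spike. The encoding and readout below are illustrative simplifications, not the paper's full TNN architecture:

```python
import numpy as np

def encode_spike_times(values, t_max=16):
    """Temporal code: larger values fire earlier (relative spike times)."""
    v = np.clip(values, 0.0, 1.0)
    return np.round((1.0 - v) * t_max).astype(int)  # time step of each spike

def wta_readout(spike_times):
    """Race-based winner-take-all: the earliest-firing unit wins."""
    return int(np.argmin(spike_times))

x = np.array([0.2, 0.9, 0.5])
times = encode_spike_times(x)      # e.g. [13, 2, 8]
winner = wta_readout(times)        # unit 1 fires first
```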
- Multi-Sample Online Learning for Probabilistic Spiking Neural Networks [43.8805663900608]
Spiking Neural Networks (SNNs) capture some of the efficiency of biological brains for inference and learning.
This paper introduces an online learning rule based on generalized expectation-maximization (GEM).
Experimental results on structured output memorization and classification on a standard neuromorphic data set demonstrate significant improvements in terms of log-likelihood, accuracy, and calibration.
arXiv Detail & Related papers (2020-07-23T10:03:58Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
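For context, the simplest rate-based ANN-to-SNN conversion (the baseline such frameworks improve upon) reuses trained ANN weights and replaces ReLU units with integrate-and-fire neurons whose firing rates approximate the original activations. A single-layer sketch under those standard assumptions; this is not the paper's tandem learning procedure:

```python
import numpy as np

def if_layer_rates(W, b, x, t_steps=100):
    """Integrate-and-fire layer driven by a constant input; returns firing rates.

    With a unit threshold and constant drive, each neuron's spike rate
    approximates the ReLU activation of the source ANN layer.
    """
    v = np.zeros(W.shape[0])
    spikes = np.zeros(W.shape[0])
    for _ in range(t_steps):
        v += W @ x + b                 # integrate the constant synaptic current
        fired = v >= 1.0               # unit threshold
        spikes += fired
        v[fired] -= 1.0                # soft reset (subtract threshold)
    return spikes / t_steps

rng = np.random.default_rng(0)
W, b, x = rng.normal(size=(5, 8)) * 0.1, np.zeros(5), rng.uniform(0, 1, 8)
print(if_layer_rates(W, b, x))         # ~ np.maximum(W @ x + b, 0), capped at 1
```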
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
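The strategy can be summarized as a joint objective: fit the task output while also penalizing mismatch between a subset of hidden units and the recorded neural activities. The schematic below uses assumed names and a simple pairing of hidden units to recorded neurons; it is not the authors' exact formulation:

```python
import torch

def joint_loss(y_pred, y_target, h_states, neural_data, observed_idx, alpha=1.0):
    """Output loss plus internal-dynamics loss on a sparse subset of units.

    h_states:     (time, n_hidden) RNN hidden states
    neural_data:  (time, n_observed) recorded activities to reproduce
    observed_idx: hidden units paired with recorded neurons (an assumption)
    """
    task = torch.mean((y_pred - y_target) ** 2)
    dyn = torch.mean((h_states[:, observed_idx] - neural_data) ** 2)
    return task + alpha * dyn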
- Exploiting Neuron and Synapse Filter Dynamics in Spatial Temporal Learning of Deep Spiking Neural Network [7.503685643036081]
A bio-plausible SNN model with spatial-temporal properties is a complex dynamical system.
We formulate the SNN as a network of infinite impulse response (IIR) filters with neuron nonlinearity.
We propose a training algorithm that is capable of learning spatial-temporal patterns by searching for the optimal synapse filter kernels and weights.
arXiv Detail & Related papers (2020-02-19T01:27:39Z)
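The IIR formulation is concrete in the simplest case: a leaky membrane is a first-order IIR filter, v[t] = a * v[t-1] + i[t], followed by a threshold nonlinearity. The paper generalizes this to learnable synapse filter kernels; the sketch below covers only the first-order case:

```python
import numpy as np

def lif_iir(current, a=0.9, threshold=1.0):
    """LIF neuron as a first-order IIR filter plus a spiking nonlinearity.

    v[t] = a * v[t-1] + i[t]; emit a spike and soft-reset on threshold crossing.
    """
    v, spikes, vs = 0.0, [], []
    for i_t in current:
        v = a * v + i_t                # first-order IIR (leaky integration)
        if v >= threshold:
            spikes.append(1)
            v -= threshold             # soft reset
        else:
            spikes.append(0)
        vs.append(v)
    return np.array(spikes), np.array(vs)

spk, mem = lif_iir(np.full(50, 0.15))  # constant drive produces periodic spikes
```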
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and accepts no responsibility for any consequences arising from its use.