Automatic Implementation of Neural Networks through Reaction Networks --
Part I: Circuit Design and Convergence Analysis
- URL: http://arxiv.org/abs/2311.18313v1
- Date: Thu, 30 Nov 2023 07:31:36 GMT
- Title: Automatic Implementation of Neural Networks through Reaction Networks --
Part I: Circuit Design and Convergence Analysis
- Authors: Yuzhen Fan, Xiaoyu Zhang, Chuanhou Gao, Denis Dochain
- Abstract summary: This two-part article aims to introduce a programmable biochemical reaction network (BCRN) system endowed with mass action kinetics.
In part I, the feedforward propagation computation, the backpropagation component, and all bridging processes of FCNN are ingeniously designed as specific BCRN modules.
- Score: 9.107489906506798
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Information processing relying on biochemical interactions in the cellular
environment is essential for biological organisms. The implementation of
molecular computational systems holds significant interest and potential in the
fields of synthetic biology and molecular computation. This two-part article
aims to introduce a programmable biochemical reaction network (BCRN) system
endowed with mass action kinetics that realizes the fully connected neural
network (FCNN) and has the potential to act automatically in vivo. In part I,
the feedforward propagation computation, the backpropagation component, and all
bridging processes of FCNN are ingeniously designed as specific BCRN modules
based on their dynamics. This approach addresses a design gap in the
biochemical assignment module and judgment termination module and provides a
novel, precise, and robust realization of bi-molecular reactions for the learning
process. Through equilibrium approaching, we demonstrate that the designed BCRN
system achieves FCNN functionality with exponential convergence to target
computational results, thereby enhancing the theoretical support for such work.
Finally, the performance of this construction is further evaluated on two
typical logic classification problems.
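The core idea of the abstract, encoding a computation as a mass-action reaction network whose equilibrium is the target value and whose convergence is exponential, can be illustrated with a classic textbook toy example (this is not the authors' actual FCNN construction): the reactions X1 + X2 → X1 + X2 + Y and Y → ∅ under mass-action kinetics give dy/dt = x1·x2 − y, so y(t) approaches x1·x2 with error decaying like e^(−t).

```python
# Toy illustration (not the paper's construction): a mass-action CRN
# computing y* = x1 * x2 with exponential convergence.
# Reactions (all rate constants set to 1):
#   X1 + X2 -> X1 + X2 + Y   (produces Y at rate x1 * x2)
#   Y -> 0                   (degrades Y at rate y)
# Resulting ODE: dy/dt = x1*x2 - y, whose unique equilibrium is y = x1*x2.

def simulate(x1, x2, y0=0.0, dt=1e-3, t_end=20.0):
    """Forward-Euler integration of dy/dt = x1*x2 - y."""
    y = y0
    for _ in range(int(t_end / dt)):
        y += dt * (x1 * x2 - y)
    return y

if __name__ == "__main__":
    target = 0.6 * 0.8
    y_final = simulate(0.6, 0.8)
    # After t = 20, the residual error is on the order of e^{-20} ~ 2e-9.
    print(abs(y_final - target) < 1e-6)
```

The same pattern (catalytic production balanced by degradation) underlies many CRN computation schemes; the paper's contribution is assembling such modules into a full feedforward/backpropagation pipeline with proved exponential convergence.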
Related papers
- Wet TinyML: Chemical Neural Network Using Gene Regulation and Cell
Plasticity [5.9659016963634315]
Wet TinyML is a form of chemical neural network based on gene regulatory networks.
GRNNs can be used for conventional computing by employing an application-based search process.
We show that, through directed cell plasticity, we can extract the mathematical regression evolution, enabling GRNNs to match dynamic system applications.
arXiv Detail & Related papers (2024-03-13T14:00:18Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Contrastive-Signal-Dependent Plasticity: Forward-Forward Learning of
Spiking Neural Systems [73.18020682258606]
We develop a neuro-mimetic architecture, composed of spiking neuronal units, where individual layers of neurons operate in parallel.
We propose an event-based generalization of forward-forward learning, which we call contrastive-signal-dependent plasticity (CSDP)
Our experimental results on several pattern datasets demonstrate that the CSDP process works well for training a dynamic recurrent spiking network capable of both classification and reconstruction.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural
Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC that predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with
Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in a 180 nm process technology with two hierarchical populations.
The proposed approach enables the developments of biomimetic neuromorphic system and various low-power, and low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
- Programming and Training Rate-Independent Chemical Reaction Networks [9.001036626196258]
Natural biochemical systems are typically modeled by chemical reaction networks (CRNs)
CRNs can be used as a specification language for synthetic chemical computation.
We show a technique to program NC-CRNs using well-founded deep learning methods.
arXiv Detail & Related papers (2021-09-20T15:31:03Z)
- Recurrent Neural Network Learning of Performance and Intrinsic
Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
- Deep Molecular Programming: A Natural Implementation of Binary-Weight
ReLU Neural Networks [7.700240949386079]
We show how a BinaryConnect neural network trained in silico can be compiled to an equivalent chemical reaction network.
Our work sets the stage for rich knowledge transfer between neural network and molecular programming communities.
arXiv Detail & Related papers (2020-03-30T18:12:11Z)
- Autonomous Discovery of Unknown Reaction Pathways from Data by Chemical
Reaction Neural Network [0.0]
Chemical reactions occur in energy, environmental, biological, and many other natural systems.
Here, we present a neural network approach that autonomously discovers reaction pathways from the time-resolved species concentration data.
The proposed Chemical Reaction Neural Network (CRNN), by design, satisfies the fundamental physics laws, including the Law of Mass Action and the Arrhenius Law.
arXiv Detail & Related papers (2020-02-20T23:36:46Z)
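The CRNN entry above constrains learned kinetics to two physical laws. As a quick reference (this is not the paper's code, and all parameter values below are illustrative assumptions), the two laws can be sketched directly: mass action gives a rate r = k · ∏ c_i^ν_i over reactant concentrations, and Arrhenius gives the temperature dependence k(T) = A · exp(−Ea / (R·T)).

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_rate_constant(A, Ea, T):
    """Arrhenius law: k(T) = A * exp(-Ea / (R*T)).
    A: pre-exponential factor, Ea: activation energy (J/mol), T: temperature (K)."""
    return A * math.exp(-Ea / (R * T))

def mass_action_rate(k, concentrations, orders):
    """Law of mass action: r = k * prod(c_i ** nu_i)
    for reactant concentrations c_i with reaction orders nu_i."""
    r = k
    for c, nu in zip(concentrations, orders):
        r *= c ** nu
    return r

if __name__ == "__main__":
    # Illustrative values only: a bimolecular step A + B -> products.
    k = arrhenius_rate_constant(A=1e10, Ea=8.0e4, T=300.0)
    r = mass_action_rate(k, concentrations=[0.5, 0.2], orders=[1, 1])
```

Building these forms into the network architecture is what lets a CRNN recover interpretable rate constants and stoichiometries from concentration time series rather than an opaque fit.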
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.