Programming and Training Rate-Independent Chemical Reaction Networks
- URL: http://arxiv.org/abs/2109.11422v1
- Date: Mon, 20 Sep 2021 15:31:03 GMT
- Title: Programming and Training Rate-Independent Chemical Reaction Networks
- Authors: Marko Vasic, Cameron Chalk, Austin Luchsinger, Sarfraz Khurshid, and
David Soloveichik
- Abstract summary: Natural biochemical systems are typically modeled by chemical reaction networks (CRNs).
CRNs can be used as a specification language for synthetic chemical computation.
We show a technique to program NC-CRNs using well-founded deep learning methods.
- Score: 9.001036626196258
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Embedding computation in biochemical environments incompatible with
traditional electronics is expected to have wide-ranging impact in synthetic
biology, medicine, nanofabrication and other fields. Natural biochemical
systems are typically modeled by chemical reaction networks (CRNs), and CRNs
can be used as a specification language for synthetic chemical computation. In
this paper, we identify a class of CRNs called non-competitive (NC) whose
equilibria are absolutely robust to reaction rates and kinetic rate law,
because their behavior is captured solely by their stoichiometric structure.
Unlike prior work on rate-independent CRNs, checking non-competition and using
it as a design criterion is easy and promises robust output. We also present a
technique to program NC-CRNs using well-founded deep learning methods, showing
a translation procedure from rectified linear unit (ReLU) neural networks to
NC-CRNs. In the case of binary weight ReLU networks, our translation procedure
is surprisingly tight in the sense that a single bimolecular reaction
corresponds to a single ReLU node and vice versa. This compactness argues that
neural networks may be a fitting paradigm for programming rate-independent
chemical computation. As proof of principle, we demonstrate our scheme with
numerical simulations of CRNs translated from neural networks trained on
traditional machine learning datasets (IRIS and MNIST), as well as tasks better
aligned with potential biological applications including virus detection and
spatial pattern formation.
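The abstract's central claim is that a non-competitive CRN's equilibrium is fixed by stoichiometry alone, so the output is independent of rate constants, and that one bimolecular reaction can realize one ReLU node. A minimal sketch of this idea (a standard dual-rail rate-independent construction, not necessarily the paper's exact reaction scheme) simulates mass-action kinetics for a tiny CRN computing y = ReLU(x+ - x-) and shows the same equilibrium under very different rate constants:

```python
# Minimal sketch (illustrative, not the paper's exact construction):
# a dual-rail, rate-independent CRN computing y = ReLU(x+ - x-).
# Reactions:  X+      -> Y    (rate k1)
#             X- + Y  -> 0    (rate k2)
# The equilibrium value of Y depends only on stoichiometry, not on k1, k2.

def simulate(xp, xm, k1, k2, dt=1e-3, t_end=300.0):
    """Euler-integrate the mass-action ODEs and return the final Y."""
    y = 0.0
    t = 0.0
    while t < t_end:
        d_xp = -k1 * xp               # X+ is consumed producing Y
        d_y  =  k1 * xp - k2 * xm * y # Y produced by X+, consumed with X-
        d_xm = -k2 * xm * y           # X- annihilates Y
        xp += d_xp * dt
        y  += d_y * dt
        xm += d_xm * dt
        t += dt
    return y

# Same inputs, very different rate constants -> same output.
fast = simulate(3.0, 1.0, k1=1.0, k2=1.0)
slow = simulate(3.0, 1.0, k1=0.2, k2=5.0)
print(fast, slow)  # both approach ReLU(3 - 1) = 2
```

The invariant behind this: Y + X+ - X- is conserved by both reactions, and at equilibrium the remaining X+ and min(X-, Y) are depleted, forcing Y to max(0, x+ - x-) no matter how fast or slow each reaction runs.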
Related papers
- NeuralCRNs: A Natural Implementation of Learning in Chemical Reaction Networks [0.0]
We present a novel supervised learning framework constructed as a collection of deterministic chemical reaction networks (CRNs).
Unlike prior works, the NeuralCRNs framework is founded on dynamical system-based learning implementations and, thus, results in chemically compatible computations.
arXiv Detail & Related papers (2024-08-18T01:43:26Z)
- CHANI: Correlation-based Hawkes Aggregation of Neurons with bio-Inspiration [7.26259898628108]
The present work aims to prove mathematically that a neural network inspired by biology can learn a classification task through local transformations only.
We propose a spiking neural network named CHANI, whose neurons' activity is modeled by Hawkes processes.
arXiv Detail & Related papers (2024-05-29T07:17:58Z)
- Automatic Implementation of Neural Networks through Reaction Networks -- Part I: Circuit Design and Convergence Analysis [9.107489906506798]
This two-part article aims to introduce a programmable biochemical reaction network (BCRN) system endowed with mass action kinetics.
In part I, the feedforward propagation computation, the backpropagation component, and all bridging processes of a fully connected neural network (FCNN) are designed as specific BCRN modules.
arXiv Detail & Related papers (2023-11-30T07:31:36Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- Differentiable Programming of Chemical Reaction Networks [63.948465205530916]
Chemical reaction networks are one of the most fundamental computational substrates used by nature.
We study well-mixed single-chamber systems, as well as systems with multiple chambers separated by membranes.
We demonstrate that differentiable optimisation, combined with proper regularisation, can discover non-trivial sparse reaction networks.
arXiv Detail & Related papers (2023-02-06T11:41:14Z)
- Neural-Symbolic Recursive Machine for Systematic Generalization [113.22455566135757]
We introduce the Neural-Symbolic Recursive Machine (NSR), whose core is a Grounded Symbol System (GSS).
NSR integrates neural perception, syntactic parsing, and semantic reasoning.
We evaluate NSR's efficacy across four challenging benchmarks designed to probe systematic generalization capabilities.
arXiv Detail & Related papers (2022-10-04T13:27:38Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
- Deep Molecular Programming: A Natural Implementation of Binary-Weight ReLU Neural Networks [7.700240949386079]
We show how a BinaryConnect neural network trained in silico can be compiled to an equivalent chemical reaction network.
Our work sets the stage for rich knowledge transfer between neural network and molecular programming communities.
arXiv Detail & Related papers (2020-03-30T18:12:11Z)
- Autonomous Discovery of Unknown Reaction Pathways from Data by Chemical Reaction Neural Network [0.0]
Chemical reactions occur in energy, environmental, biological, and many other natural systems.
Here, we present a neural network approach that autonomously discovers reaction pathways from time-resolved species concentration data.
The proposed Chemical Reaction Neural Network (CRNN), by design, satisfies the fundamental physics laws, including the Law of Mass Action and the Arrhenius Law.
arXiv Detail & Related papers (2020-02-20T23:36:46Z)
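The CRNN summary above names two constraints baked into its design: the Law of Mass Action and the Arrhenius Law. A short generic illustration of both (not code from that paper; the reaction, pre-exponential factor, and activation energy below are made-up example values):

```python
import math

# Mass action: rate = k * prod(conc_i ** stoich_i) over the reactants.
# Arrhenius:   k(T) = A * exp(-Ea / (R * T)).

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius(A, Ea, T):
    """Rate constant from the Arrhenius law."""
    return A * math.exp(-Ea / (R * T))

def mass_action_rate(k, concs, stoich):
    """Reaction rate under the law of mass action."""
    rate = k
    for c, s in zip(concs, stoich):
        rate *= c ** s
    return rate

# Hypothetical reaction 2A + B -> C at 300 K (A and Ea are illustrative).
k = arrhenius(A=1e7, Ea=5e4, T=300.0)
r = mass_action_rate(k, concs=[0.1, 0.2], stoich=[2, 1])
```

A CRNN-style model makes quantities like A, Ea, and the stoichiometric exponents the learnable parameters, so anything it fits to concentration data remains a physically interpretable reaction network.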
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.