Deep Molecular Programming: A Natural Implementation of Binary-Weight
ReLU Neural Networks
- URL: http://arxiv.org/abs/2003.13720v3
- Date: Tue, 30 Jun 2020 15:38:06 GMT
- Authors: Marko Vasic and Cameron Chalk and Sarfraz Khurshid and David
Soloveichik
- Abstract summary: We show how a BinaryConnect neural network trained in silico can be compiled to an equivalent chemical reaction network.
Our work sets the stage for rich knowledge transfer between neural network and molecular programming communities.
- Score: 7.700240949386079
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Embedding computation in molecular contexts incompatible with traditional
electronics is expected to have wide ranging impact in synthetic biology,
medicine, nanofabrication and other fields. A key remaining challenge lies in
developing programming paradigms for molecular computation that are
well-aligned with the underlying chemical hardware and do not attempt to
shoehorn ill-fitting electronics paradigms. We discover a surprisingly tight
connection between a popular class of neural networks (binary-weight ReLU aka
BinaryConnect) and a class of coupled chemical reactions that are absolutely
robust to reaction rates. The robustness of rate-independent chemical
computation makes it a promising target for bioengineering implementation. We
show how a BinaryConnect neural network trained in silico using well-founded
deep learning optimization techniques can be compiled to an equivalent
chemical reaction network, providing a novel molecular programming paradigm. We
illustrate such translation on the paradigmatic IRIS and MNIST datasets. Toward
intended applications of chemical computation, we further use our method to
generate a chemical reaction network that can discriminate between different
virus types based on gene expression levels. Our work sets the stage for rich
knowledge transfer between neural network and molecular programming
communities.
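To make the correspondence described above concrete, here is a minimal, illustrative sketch (not the authors' compiler; all names such as `relu_crn` and the species `Xp`/`Xn`/`Y` are hypothetical). A ReLU unit with binary weights can be split dual-rail into a positive and a negative pre-activation part; the reactions `Xp -> Y` and `Xn + Y -> 0` then leave exactly max(0, Xp − Xn) copies of `Y` regardless of the order or rates at which reactions fire, which is what rate-independence means here.

```python
import numpy as np

def relu_crn(x_pos, x_neg):
    """Rate-independent CRN computing y = max(0, x_pos - x_neg).

    Species are molecular counts; reactions (illustrative dual-rail scheme):
        Xp -> Y        (each unit of Xp produces one Y)
        Xn + Y -> 0    (each unit of Xn consumes one Y)
    The final count of Y is max(0, Xp - Xn) no matter which enabled
    reaction fires next, so the outcome is independent of rates.
    """
    Xp, Xn, Y = x_pos, x_neg, 0
    rng = np.random.default_rng(0)
    while True:
        enabled = []
        if Xp > 0:
            enabled.append("produce")
        if Xn > 0 and Y > 0:
            enabled.append("consume")
        if not enabled:
            break
        if rng.choice(enabled) == "produce":
            Xp -= 1; Y += 1
        else:
            Xn -= 1; Y -= 1
    return Y

# A tiny BinaryConnect-style layer: weights constrained to {-1, +1},
# integer inputs standing in for molecular counts.
W = np.array([[1, -1, 1],
              [-1, 1, 1]])
b = np.array([0, -1])
x = np.array([3, 5, 2])

reference = np.maximum(0, W @ x + b)   # in-silico ReLU layer

for i in range(W.shape[0]):
    pos = int(x[W[i] == 1].sum() + max(b[i], 0))    # feeds species Xp
    neg = int(x[W[i] == -1].sum() + max(-b[i], 0))  # feeds species Xn
    assert relu_crn(pos, neg) == reference[i]
```

The random choice of which enabled reaction fires next stands in for arbitrary reaction timing; because the asserts pass for any firing order, the chemical layer agrees with the in-silico layer without any rate tuning.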
Related papers
- Unsupervised representation learning with Hebbian synaptic and structural plasticity in brain-like feedforward neural networks [0.0]
We introduce and evaluate a brain-like neural network model capable of unsupervised representation learning.
The model was tested on a diverse set of popular machine learning benchmarks.
arXiv Detail & Related papers (2024-06-07T08:32:30Z)
- Wet TinyML: Chemical Neural Network Using Gene Regulation and Cell Plasticity [5.9659016963634315]
Wet TinyML is a form of chemical-based neural network built on gene regulatory networks (GRNs).
GRN-based neural networks (GRNNs) can be used for conventional computing by employing an application-based search process.
We show that, through directed cell plasticity, the evolving mathematical regression can be extracted, enabling the network to match dynamic system applications.
arXiv Detail & Related papers (2024-03-13T14:00:18Z)
- A Review of Neuroscience-Inspired Machine Learning [58.72729525961739]
Bio-plausible credit assignment is compatible with practically any learning condition and is energy-efficient.
In this paper, we survey several vital algorithms that model bio-plausible rules of credit assignment in artificial neural networks.
We conclude by discussing the future challenges that will need to be addressed in order to make such algorithms more useful in practical applications.
arXiv Detail & Related papers (2024-02-16T18:05:09Z)
- Automatic Implementation of Neural Networks through Reaction Networks -- Part I: Circuit Design and Convergence Analysis [9.107489906506798]
This two-part article aims to introduce a programmable biochemical reaction network (BCRN) system endowed with mass action kinetics.
In part I, the feedforward propagation computation, the backpropagation component, and all bridging processes of FCNN are ingeniously designed as specific BCRN modules.
arXiv Detail & Related papers (2023-11-30T07:31:36Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
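The hidden-neuron permutation symmetry this entry refers to can be checked numerically in a few lines. This is an illustrative sketch, not code from the paper: relabeling the hidden units of a two-layer ReLU network (permuting the rows of the first weight matrix and the matching columns of the second) leaves the computed function unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # first layer: 4 hidden units, 3 inputs
W2 = rng.normal(size=(2, 4))   # second layer reads the 4 hidden units
x = rng.normal(size=3)

def forward(W1, W2, x):
    """Two-layer feedforward ReLU network."""
    return W2 @ np.maximum(0, W1 @ x)

perm = rng.permutation(4)      # arbitrary relabeling of hidden neurons
# Permuting hidden-unit rows of W1 together with the matching
# columns of W2 yields the same function of x.
assert np.allclose(forward(W1, W2, x), forward(W1[perm], W2[:, perm], x))
```

An equivariant architecture must produce outputs that respect exactly this symmetry, which is why hidden-layer weights have no canonical ordering to exploit.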
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- Differentiable Programming of Chemical Reaction Networks [63.948465205530916]
Chemical reaction networks are one of the most fundamental computational substrates used by nature.
We study well-mixed single-chamber systems, as well as systems with multiple chambers separated by membranes.
We demonstrate that differentiable optimisation, combined with proper regularisation, can discover non-trivial sparse reaction networks.
arXiv Detail & Related papers (2023-02-06T11:41:14Z)
- ViSNet: an equivariant geometry-enhanced graph neural network with vector-scalar interactive message passing for molecules [69.05950120497221]
We propose an equivariant geometry-enhanced graph neural network called ViSNet, which elegantly extracts geometric features and efficiently models molecular structures.
Our proposed ViSNet outperforms state-of-the-art approaches on multiple MD benchmarks, including MD17, revised MD17 and MD22, and achieves excellent chemical property prediction on QM9 and Molecule3D datasets.
arXiv Detail & Related papers (2022-10-29T07:12:46Z)
- A photonic chip-based machine learning approach for the prediction of molecular properties [11.55177943027656]
Photonic chip technology offers an alternative platform for implementing neural network with faster data processing and lower energy usage.
We demonstrate the capability of photonic neural networks in predicting the quantum mechanical properties of molecules.
Our work opens the avenue for harnessing photonic technology for large-scale machine learning applications in molecular sciences.
arXiv Detail & Related papers (2022-03-03T03:15:14Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs), which emulate the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Programming and Training Rate-Independent Chemical Reaction Networks [9.001036626196258]
Natural biochemical systems are typically modeled by chemical reaction networks (CRNs).
CRNs can be used as a specification language for synthetic chemical computation.
We show a technique to program NC-CRNs using well-founded deep learning methods.
arXiv Detail & Related papers (2021-09-20T15:31:03Z) - Credit Assignment in Neural Networks through Deep Feedback Control [59.14935871979047]
Deep Feedback Control (DFC) is a new learning method that uses a feedback controller to drive a deep neural network to match a desired output target and whose control signal can be used for credit assignment.
The resulting learning rule is fully local in space and time and approximates Gauss-Newton optimization for a wide range of connectivity patterns.
To further underline its biological plausibility, we relate DFC to a multi-compartment model of cortical pyramidal neurons with a local voltage-dependent synaptic plasticity rule, consistent with recent theories of dendritic processing.
arXiv Detail & Related papers (2021-06-15T05:30:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers or summaries (including all information) and is not responsible for any consequences of their use.