Wet TinyML: Chemical Neural Network Using Gene Regulation and Cell
Plasticity
- URL: http://arxiv.org/abs/2403.08549v1
- Date: Wed, 13 Mar 2024 14:00:18 GMT
- Title: Wet TinyML: Chemical Neural Network Using Gene Regulation and Cell
Plasticity
- Authors: Samitha Somathilaka, Adrian Ratwatte, Sasitharan Balasubramaniam,
Mehmet Can Vuran, Witawas Srisa-an, Pietro Liò
- Abstract summary: Wet TinyML is a form of chemical-based neural network built on gene regulatory networks.
GRNNs can be used for conventional computing by employing an application-based search process.
We show that, through directed cell plasticity, we can extract an evolving mathematical regression, enabling the GRNN to match dynamic system applications.
- Score: 5.9659016963634315
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In our earlier work, we introduced the concept of Gene Regulatory Neural
Network (GRNN), which utilizes natural neural network-like structures inherent
in biological cells to perform computing tasks using chemical inputs. We define
this form of chemical-based neural network as Wet TinyML. The GRNN structures
are based on the gene regulatory network and have weights associated with each
link based on the estimated interactions between the genes. The GRNNs can be
used for conventional computing by employing an application-based search
process similar to Network Architecture Search. This study advances the concept
by incorporating cell plasticity, further exploiting the natural cell's
adaptability, in order to diversify the GRNN search so that it can match a
larger spectrum of computing tasks, including dynamic ones. As an example
application, we show that, through directed cell plasticity, we can extract an
evolving mathematical regression, enabling the GRNN to match dynamic system
applications. We
also conduct energy analysis by comparing the chemical energy of the GRNN to
its silicon counterpart, where this analysis includes both artificial neural
network algorithms executed on von Neumann architecture as well as neuromorphic
processors. The concept of Wet TinyML can pave the way for the emergence of
chemical-based, energy-efficient, and miniature Biological AI.
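As a rough illustration of the GRNN search idea described above, the sketch below treats an entirely synthetic gene regulatory network as a fixed weighted digraph and searches over input/output gene assignments for the pair whose induced mapping best fits a toy regression target, loosely mirroring Network Architecture Search. The network size, the tanh regulation kinetics, and the target function are illustrative assumptions, not the paper's model.
```python
# Minimal sketch of a GRNN-style search (illustrative, not the paper's code).
# A gene regulatory network is modeled as a fixed weighted digraph; the
# "search" picks the input/output gene assignment whose induced mapping best
# matches a target function.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N_GENES = 12                                        # assumed toy network size
W = rng.normal(0.0, 1.0, size=(N_GENES, N_GENES))   # estimated gene-gene weights

def grnn_forward(x, in_gene, out_gene, steps=3):
    """Clamp the input gene to x, relax expression levels, read the output gene."""
    state = np.zeros(N_GENES)
    for _ in range(steps):
        state[in_gene] = x                  # chemical input as clamped expression
        state = np.tanh(W @ state)          # assumed smooth regulation kinetics
    return state[out_gene]

# Target task: a simple regression the search should approximate.
xs = np.linspace(-1.0, 1.0, 21)
target = 0.5 * xs                           # toy dynamic-system-style mapping

best = None
for in_g, out_g in itertools.permutations(range(N_GENES), 2):
    preds = np.array([grnn_forward(x, in_g, out_g) for x in xs])
    mse = float(np.mean((preds - target) ** 2))
    if best is None or mse < best[0]:
        best = (mse, in_g, out_g)

print(f"best (input, output) genes = {best[1:]} with MSE {best[0]:.4f}")
```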
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the computational time and space complexities, which are respectively cubic and quadratic in the sequence length, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
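The S-MNN reformulation itself is not reproduced here; the toy contrast below only shows why exploiting the banded, sequential structure of a constraint system (a Thomas-algorithm sweep, O(T)) beats a dense solve (O(T^3)) as the sequence length T grows. The tridiagonal system is an assumed stand-in.
```python
# Toy illustration (not the S-MNN reformulation): a dense solve over a
# length-T sequence costs O(T^3), while a tridiagonal forward/backward sweep
# (Thomas algorithm) solves the same system in O(T).
import numpy as np

def solve_tridiagonal(a, b, c, d):
    """Solve a tridiagonal system in O(T): a=sub-, b=main-, c=super-diagonal."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

T = 1000
a = np.full(T, -1.0); b = np.full(T, 2.1); c = np.full(T, -1.0); d = np.ones(T)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)   # dense equivalent
assert np.allclose(np.linalg.solve(A, d), solve_tridiagonal(a, b, c, d))
```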
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Inference of dynamical gene regulatory networks from single-cell data with physics informed neural networks [0.0]
We show how physics-informed neural networks (PINNs) can be used to infer the parameters of predictive, dynamical GRNs.
Specifically, we study GRNs that exhibit bifurcation behavior and can therefore model cell differentiation.
arXiv Detail & Related papers (2024-01-14T21:43:10Z)
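A minimal PINN-style sketch of the inference idea, under an assumed one-gene toy ODE dx/dt = a/(1 + x^2) - b*x rather than the bifurcating GRNs studied in the paper: a network x(t) is trained against noisy data while its autograd derivative is penalized for violating the ODE, and the rate parameters a, b are fit jointly.
```python
# Minimal PINN-style sketch (assumed toy model, not the paper's GRNs):
# infer parameters (a, b) of dx/dt = a/(1 + x^2) - b*x from noisy data by
# training a network x_theta(t) with a data loss plus an ODE-residual loss.
import torch

torch.manual_seed(0)
a_true, b_true = 2.0, 1.0

# Synthetic "measurements" from a forward Euler integration of the true ODE.
ts = torch.linspace(0.0, 4.0, 200).unsqueeze(1)
x = torch.zeros_like(ts); x[0] = 0.1
for i in range(1, len(ts)):
    dt = ts[i] - ts[i - 1]
    x[i] = x[i - 1] + dt * (a_true / (1 + x[i - 1] ** 2) - b_true * x[i - 1])
x_obs = x + 0.01 * torch.randn_like(x)

net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))
a_hat = torch.nn.Parameter(torch.tensor(0.5))
b_hat = torch.nn.Parameter(torch.tensor(0.5))
opt = torch.optim.Adam(list(net.parameters()) + [a_hat, b_hat], lr=1e-2)

for step in range(2000):
    t = ts.clone().requires_grad_(True)
    x_pred = net(t)
    dxdt = torch.autograd.grad(x_pred.sum(), t, create_graph=True)[0]
    residual = dxdt - (a_hat / (1 + x_pred ** 2) - b_hat * x_pred)
    loss = ((net(ts) - x_obs) ** 2).mean() + (residual ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

print(f"inferred a={a_hat.item():.2f} (true 2.0), b={b_hat.item():.2f} (true 1.0)")
```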
- Stability Analysis of Non-Linear Classifiers using Gene Regulatory Neural Network for Biological AI [2.0755366440393743]
We develop a mathematical model of gene-perceptron using a dual-layered transcription-translation chemical reaction model.
We perform stability analysis for each gene-perceptron within the fully-connected GRNN sub-network to determine temporal as well as stable concentration outputs.
arXiv Detail & Related papers (2023-09-14T21:37:38Z)
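A minimal sketch of the dual-layered transcription-translation idea with assumed rate constants: mRNA and protein follow linear production/decay ODEs, and the analytically stable protein concentration plays the role of the gene-perceptron's output.
```python
# Sketch of a gene-perceptron as a transcription-translation ODE pair
# (rate constants are assumed, illustrative values):
#   dm/dt = k1 * u - d1 * m   (transcription driven by input activity u)
#   dp/dt = k2 * m - d2 * p   (translation of mRNA into protein)
# The analytic steady state p* = k1*k2*u / (d1*d2) is the stable "output"
# concentration that a stability analysis is concerned with.
import numpy as np

k1, d1, k2, d2 = 1.2, 0.8, 0.9, 0.5   # assumed rate constants
u = 0.6                                # input (TF activity), held constant

def simulate(u, t_end=40.0, dt=0.01):
    m = p = 0.0
    for _ in range(int(t_end / dt)):   # forward Euler in time
        m += dt * (k1 * u - d1 * m)
        p += dt * (k2 * m - d2 * p)
    return m, p

m_T, p_T = simulate(u)
p_star = k1 * k2 * u / (d1 * d2)
print(f"simulated p(T)={p_T:.4f}, analytic steady state p*={p_star:.4f}")
```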
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Inferring Gene Regulatory Neural Networks for Bacterial Decision Making in Biofilms [4.459301404374565]
Bacterial cells are sensitive to a range of external signals that they use to learn about their environment.
An inherited Gene Regulatory Neural Network (GRNN) behavior enables the cellular decision-making.
GRNNs can perform computational tasks for bio-hybrid computing systems.
arXiv Detail & Related papers (2023-01-10T22:07:33Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
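No Loihi APIs are reproduced here; the sketch below is only a generic leaky integrate-and-fire point neuron of the kind such hardware mappings start from, with assumed membrane constants.
```python
# Generic leaky integrate-and-fire (LIF) point neuron, the kind of model such
# hardware mappings start from. Constants are assumed, not Loihi's parameters.
import numpy as np

dt, tau, v_rest, v_thresh, v_reset = 1e-3, 20e-3, -65e-3, -50e-3, -65e-3
R = 1e7                                 # membrane resistance (ohms), assumed

def lif(input_current, v0=v_rest):
    """Euler-integrate dv/dt = (v_rest - v + R*I)/tau; spike and reset at threshold."""
    v, spikes, trace = v0, [], []
    for t, i_in in enumerate(input_current):
        v += dt * (v_rest - v + R * i_in) / tau
        if v >= v_thresh:
            spikes.append(t * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

current = np.full(500, 2e-9)            # 2 nA step input for 0.5 s
trace, spikes = lif(current)
print(f"{len(spikes)} spikes, first at {spikes[0]:.3f}s" if spikes else "no spikes")
```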
- A multi-agent model for growing spiking neural networks [0.0]
This project has explored rules for growing the connections between the neurons in Spiking Neural Networks as a learning mechanism.
Results in a simulation environment showed that for a given set of parameters it is possible to reach topologies that reproduce the tested functions.
This project also opens the door to using techniques like genetic algorithms to obtain the best-suited values for the model parameters.
arXiv Detail & Related papers (2020-09-21T15:11:29Z)
- Neural Cellular Automata Manifold [84.08170531451006]
We show that the neural network architecture of the Neural Cellular Automata can be encapsulated in a larger NN.
This allows us to propose a new model that encodes a manifold of NCA, each of them capable of generating a distinct image.
In biological terms, our approach would play the role of the transcription factors, modulating the mapping of genes into specific proteins that drive cellular differentiation.
arXiv Detail & Related papers (2020-06-22T11:41:57Z)
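A minimal neural cellular automaton update step in the style this line of work builds on: each cell perceives its neighborhood through fixed filters and a small shared network proposes state updates. The outer network that encodes a manifold of NCAs (the paper's contribution) is not sketched; channel counts and filters are assumptions.
```python
# Minimal neural cellular automaton (NCA) update step: depthwise perception
# with fixed identity/Sobel filters, then a shared 1x1-conv network proposes
# a state update that is applied stochastically per cell.
import torch
import torch.nn.functional as F

C = 16                                          # channels per cell, assumed
sobel_x = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]]) / 8.0
identity = torch.zeros(3, 3); identity[1, 1] = 1.0
filters = torch.stack([identity, sobel_x, sobel_x.t()])   # (3, 3, 3)
kernel = filters.repeat(C, 1, 1).unsqueeze(1)             # (3C, 1, 3, 3)

update_net = torch.nn.Sequential(torch.nn.Conv2d(3 * C, 64, 1), torch.nn.ReLU(),
                                 torch.nn.Conv2d(64, C, 1))

def nca_step(state):
    """state: (B, C, H, W) grid of cell vectors -> one stochastic update."""
    perception = F.conv2d(state, kernel, padding=1, groups=C)  # depthwise perceive
    delta = update_net(perception)
    mask = (torch.rand_like(state[:, :1]) < 0.5).float()       # async-style update
    return state + delta * mask

grid = torch.zeros(1, C, 32, 32); grid[:, :, 16, 16] = 1.0     # single seed cell
for _ in range(20):
    grid = nca_step(grid)
print(grid.shape)
```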
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
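A toy rendering of the training strategy: the student RNN's loss combines output reproduction with a hidden-state term on a small observed subset of teacher units. The teacher here is just a random RNN, not the physiologically-inspired model used in the paper.
```python
# Toy version of the training idea: fit a student RNN with a loss on the
# output signal PLUS a loss on the hidden states of a small observed subset
# of teacher units, so internal dynamics are shaped as well.
import torch

torch.manual_seed(0)
H, OBS = 32, 4                                  # hidden size, observed units
teacher = torch.nn.RNN(1, H, batch_first=True)
student = torch.nn.RNN(1, H, batch_first=True)
readout = torch.nn.Linear(H, 1)
opt = torch.optim.Adam(list(student.parameters()) + list(readout.parameters()), 1e-3)

inputs = torch.randn(8, 100, 1)                 # (batch, time, feature)
with torch.no_grad():
    h_teacher, _ = teacher(inputs)
    y_teacher = h_teacher.mean(dim=2, keepdim=True)   # teacher "output signal"

for step in range(500):
    h_student, _ = student(inputs)
    y_student = readout(h_student)
    loss_out = ((y_student - y_teacher) ** 2).mean()
    loss_dyn = ((h_student[..., :OBS] - h_teacher[..., :OBS]) ** 2).mean()
    loss = loss_out + loss_dyn                  # sparse internal supervision
    opt.zero_grad(); loss.backward(); opt.step()

print(f"final loss {loss.item():.4f}")
```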
- Deep Molecular Programming: A Natural Implementation of Binary-Weight ReLU Neural Networks [7.700240949386079]
We show how a BinaryConnect neural network trained in silico can be compiled to an equivalent chemical reaction network.
Our work sets the stage for rich knowledge transfer between neural network and molecular programming communities.
arXiv Detail & Related papers (2020-03-30T18:12:11Z)
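The paper's compilation targets rate-independent chemical reaction networks; the sketch below only conveys the flavor under assumed mass-action kinetics. It uses a dual-rail encoding in which positive-weight inputs feed species A, negative-weight inputs feed species B, and the reactions A -> Y and B + Y -> (nothing) drive Y to max(0, A0 - B0), i.e. a binary-weight ReLU.
```python
# Flavor-only sketch of compiling a binary-weight ReLU unit into a chemical
# reaction network. Reactions:  A -> Y  and  B + Y -> (nothing).  At steady
# state, Y = max(0, A0 - B0) = ReLU(sum_i w_i * x_i) with w_i in {+1, -1}.
import numpy as np

def relu_crn(x, w, k=1.0, dt=1e-3, t_end=50.0):
    A = sum(xi for xi, wi in zip(x, w) if wi > 0)   # positive-rail concentration
    B = sum(xi for xi, wi in zip(x, w) if wi < 0)   # negative-rail concentration
    Y = 0.0
    for _ in range(int(t_end / dt)):                # mass-action kinetics, Euler
        dA = -k * A
        dY = k * A - k * B * Y
        dB = -k * B * Y
        A += dt * dA; Y += dt * dY; B += dt * dB
    return Y

x = [0.9, 0.4, 0.3]
w = [+1, +1, -1]                                    # BinaryConnect-style weights
print(f"CRN output {relu_crn(x, w):.3f} vs ReLU {max(0.0, 0.9 + 0.4 - 0.3):.3f}")
```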