Programming molecular systems to emulate a learning spiking neuron
- URL: http://arxiv.org/abs/2206.02519v1
- Date: Mon, 9 May 2022 09:21:40 GMT
- Title: Programming molecular systems to emulate a learning spiking neuron
- Authors: Jakub Fil, Neil Dalchau, Dominique Chu
- Abstract summary: Hebbian theory seeks to explain how the neurons in the brain adapt to stimuli, to enable learning.
This paper explores how molecular systems can be designed to show such proto-intelligent behaviours.
We propose the first chemical reaction network that can exhibit autonomous Hebbian learning across arbitrarily many input channels.
- Score: 1.2707050104493216
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hebbian theory seeks to explain how the neurons in the brain adapt to
stimuli, to enable learning. An interesting feature of Hebbian learning is that
it is an unsupervised method and as such, does not require feedback, making it
suitable in contexts where systems have to learn autonomously. This paper
explores how molecular systems can be designed to show such proto-intelligent
behaviours, and proposes the first chemical reaction network (CRN) that can
exhibit autonomous Hebbian learning across arbitrarily many input channels. The
system emulates a spiking neuron, and we demonstrate that it can learn
statistical biases of incoming inputs. The basic CRN is a minimal,
thermodynamically plausible set of micro-reversible chemical equations that can
be analysed with respect to their energy requirements. However, to explore how
such chemical systems might be engineered de novo, we also propose an extended
version based on enzyme-driven compartmentalised reactions. Finally, we also
show how a purely DNA system, built upon the paradigm of DNA strand
displacement, can realise neuronal dynamics. Our analysis provides a compelling
blueprint for exploring autonomous learning in biological settings, bringing us
closer to realising real synthetic biological intelligence.
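
To make the learning behaviour described in the abstract concrete, here is a minimal sketch, assuming a standard leaky integrate-and-fire neuron with a normalised Hebbian weight update; it only illustrates the kind of dynamics the proposed CRN is said to emulate chemically. All names, parameters, and input statistics below are illustrative assumptions and are not taken from the paper.

```python
# Minimal illustrative sketch (assumption, not the paper's CRN): a leaky
# integrate-and-fire neuron with a normalised Hebbian update, showing how
# an autonomous learner can pick up a statistical bias in its inputs.
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 4                           # arbitrary number of input channels
w = np.full(n_inputs, 1.0 / n_inputs)  # synaptic weights, initially uniform
v = 0.0                                # membrane potential
tau, threshold, eta = 20.0, 1.0, 0.01  # leak constant, spike threshold, learning rate

# Channel 0 fires more often than the others (the statistical bias to learn).
p_fire = np.array([0.20, 0.05, 0.05, 0.05])

for step in range(5000):
    spikes = (rng.random(n_inputs) < p_fire).astype(float)
    v += -v / tau + w @ spikes         # leaky integration of weighted input spikes
    if v >= threshold:                 # postsynaptic spike
        v = 0.0                        # reset
        w += eta * spikes              # Hebbian: strengthen synapses that were active
        w /= w.sum()                   # normalisation keeps total weight bounded

print(np.round(w, 3))                  # weight on the biased channel should dominate
```

In the chemical setting described by the paper, the weights and membrane potential would roughly correspond to concentrations of dedicated species, with the update driven by (possibly enzyme-mediated) reactions rather than an explicit rule; the exact mapping is defined in the paper, not here.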
Related papers
- Hebbian Learning based Orthogonal Projection for Continual Learning of
Spiking Neural Networks [74.3099028063756]
We develop a new method with neuronal operations based on lateral connections and Hebbian learning.
We show that Hebbian and anti-Hebbian learning on recurrent lateral connections can effectively extract the principal subspace of neural activities.
Our method consistently enables continual learning of spiking neural networks with nearly zero forgetting.
arXiv Detail & Related papers (2024-02-19T09:29:37Z)
- Brain-Inspired Machine Intelligence: A Survey of Neurobiologically-Plausible Credit Assignment [65.268245109828]
We examine algorithms for conducting credit assignment in artificial neural networks that are inspired or motivated by neurobiology.
We organize the ever-growing set of brain-inspired learning schemes into six general families and consider these in the context of backpropagation of errors.
The results of this review are meant to encourage future developments in neuro-mimetic systems and their constituent learning processes.
arXiv Detail & Related papers (2023-12-01T05:20:57Z)
- Learning with Chemical versus Electrical Synapses -- Does it Make a Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state-of-the-art of AI systems.
We conduct experiments with autonomous lane-keeping through a photorealistic autonomous driving simulator to evaluate their performance under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Control of synaptic plasticity via the fusion of reinforcement learning and unsupervised learning in neural networks [0.0]
In cognitive neuroscience, it is widely accepted that synaptic plasticity plays an essential role in our amazing learning capability.
With this inspiration, a new learning rule is proposed via the fusion of reinforcement learning and unsupervised learning.
In the proposed computational model, nonlinear optimal control theory is used to emulate error feedback loop systems.
arXiv Detail & Related papers (2023-03-26T12:18:03Z)
- Sequence learning in a spiking neuronal network with memristive synapses [0.0]
A core concept that lies at the heart of brain computation is sequence learning and prediction.
Neuromorphic hardware emulates the way the brain processes information and maps neurons and synapses directly into a physical substrate.
We study the feasibility of using ReRAM devices as a replacement of the biological synapses in the sequence learning model.
arXiv Detail & Related papers (2022-11-29T21:07:23Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, 'Loihi'.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- A thermodynamically consistent chemical spiking neuron capable of autonomous Hebbian learning [0.5874142059884521]
We propose a fully autonomous, thermodynamically consistent set of chemical reactions that implements a spiking neuron.
This chemical neuron is able to learn input patterns in a Hebbian fashion.
In addition to the thermodynamically consistent model of the CN, we also propose a biologically plausible version that could be engineered in a synthetic biology context.
arXiv Detail & Related papers (2020-09-28T10:43:13Z)
- Brain-inspired self-organization with cellular neuromorphic computing for multimodal unsupervised learning [0.0]
We propose a brain-inspired neural system based on the reentry theory using Self-Organizing Maps and Hebbian-like learning.
We show the gain of the so-called hardware plasticity induced by the ReSOM, where the system's topology is not fixed by the user but learned through self-organization over the course of the system's experience.
arXiv Detail & Related papers (2020-04-11T21:02:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.