Are Artificial Dendrites useful in NeuroEvolution?
- URL: http://arxiv.org/abs/2010.00918v2
- Date: Tue, 23 Feb 2021 12:42:21 GMT
- Title: Are Artificial Dendrites useful in NeuroEvolution?
- Authors: Larry Bull
- Abstract summary: This letter explores the effects of including a simple dendrite-inspired mechanism into neuroevolution.
The phenomenon of separate dendrite activation thresholds on connections is allowed to emerge under an evolutionary process.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The significant role of dendritic processing within neuronal networks has become increasingly clear. This letter explores the effects of including a simple dendrite-inspired mechanism into neuroevolution. The phenomenon of separate dendrite activation thresholds on connections is allowed to emerge under an evolutionary process. It is shown how such processing can be positively selected for, particularly for connections between the hidden and output layer, and increases performance.
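As a rough illustration of the mechanism described in the abstract (a minimal sketch only, not the paper's actual implementation: the gating rule, the logistic activation, and the names dendritic_neuron and mutate are assumptions introduced here), each incoming connection carries its own evolvable activation threshold and only contributes to the neuron's sum when its input clears that threshold:

```python
import math
import random

def dendritic_neuron(inputs, weights, thresholds):
    """Sum only the weighted inputs whose activity clears the per-connection
    (dendrite-inspired) threshold, then apply a logistic squash."""
    total = sum(w * x
                for x, w, t in zip(inputs, weights, thresholds)
                if x >= t)  # a connection contributes only above its own threshold
    return 1.0 / (1.0 + math.exp(-total))

def mutate(weights, thresholds, rate=0.1, sigma=0.1):
    """Neuroevolution-style mutation: jitter weights and per-connection
    thresholds so that useful threshold values can be positively selected for."""
    new_w = [w + random.gauss(0.0, sigma) if random.random() < rate else w
             for w in weights]
    new_t = [min(1.0, max(0.0, t + random.gauss(0.0, sigma)))
             if random.random() < rate else t
             for t in thresholds]
    return new_w, new_t
```

In this sketch the thresholds evolve alongside the weights, in the spirit of the abstract's finding that thresholds on hidden-to-output connections are positively selected for.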
Related papers
- Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
It has long been known in both neuroscience and AI that "binding" between neurons leads to a form of competitive learning.
We introduce Artificial Kuramoto Oscillatory Neurons, which can be combined with arbitrary connectivity designs such as fully connected, convolutional, or attentive mechanisms (a minimal sketch of the underlying Kuramoto phase update appears after this list).
We show that this idea provides performance improvements across a wide spectrum of tasks such as unsupervised object discovery, adversarial robustness, uncertainty, and reasoning.
arXiv Detail & Related papers (2024-10-17T17:47:54Z)
- A Goal-Driven Approach to Systems Neuroscience [2.6451153531057985]
Humans and animals exhibit a range of interesting behaviors in dynamic environments.
It is unclear how our brains actively reformat this dense sensory information to enable these behaviors.
We offer a new definition of interpretability that we show has promise in yielding unified structural and functional models of neural circuits.
arXiv Detail & Related papers (2023-11-05T16:37:53Z)
- A versatile circuit for emulating active biological dendrites applied to sound localisation and neuron imitation [0.0]
We introduce a versatile circuit that emulates a segment of a dendrite which exhibits gain, introduces delays, and performs integration.
We also find that dendrites can form bursting neurons.
This significant discovery suggests the potential to fabricate neural networks solely comprised of dendrite circuits.
arXiv Detail & Related papers (2023-10-25T09:42:24Z)
- Mitigating Communication Costs in Neural Networks: The Role of Dendritic Nonlinearity [28.243134476634125]
In this study, we scrutinized the importance of nonlinear dendrites within neural networks.
Our findings reveal that integrating dendritic structures can substantially enhance model capacity and performance.
arXiv Detail & Related papers (2023-06-21T00:28:20Z)
- Artificial Dendritic Computation: The case for dendrites in neuromorphic circuits [0.0]
We investigate the motivation for replicating dendritic computation and present a framework to guide future attempts.
We evaluate the impact of dendrites on a BiLSTM neural network's performance, finding that dendrite pre-processing reduces the size of the network required for a threshold performance.
arXiv Detail & Related papers (2023-04-03T13:15:32Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Overcoming the Domain Gap in Contrastive Learning of Neural Action Representations [60.47807856873544]
A fundamental goal in neuroscience is to understand the relationship between neural activity and behavior.
We generated a new multimodal dataset consisting of the spontaneous behaviors generated by fruit flies.
This dataset and our new set of augmentations promise to accelerate the application of self-supervised learning methods in neuroscience.
arXiv Detail & Related papers (2021-11-29T15:27:51Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- The distribution of inhibitory neurons in the C. elegans connectome facilitates self-optimization of coordinated neural activity [78.15296214629433]
The nervous system of the nematode Caenorhabditis elegans exhibits remarkable complexity despite the worm's small size.
A general challenge is to better understand the relationship between neural organization and neural activity at the system level.
We implemented an abstract simulation model of the C. elegans connectome that approximates the neurotransmitter identity of each neuron.
arXiv Detail & Related papers (2020-10-28T23:11:37Z)
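The Artificial Kuramoto Oscillatory Neurons entry above builds on the classical Kuramoto phase-coupling model. As background only (this sketch is not taken from that paper; the function name kuramoto_step and the Euler discretisation are assumptions), one update step of the standard model looks like this:

```python
import math

def kuramoto_step(phases, omegas, coupling, dt=0.01):
    """One Euler step of the classical Kuramoto model:
    d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i).
    Oscillators with nearby phases pull each other together ("binding")."""
    n = len(phases)
    return [theta + dt * (omega + coupling / n
                          * sum(math.sin(other - theta) for other in phases))
            for theta, omega in zip(phases, omegas)]
```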
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.