Context-sensitive neocortical neurons transform the effectiveness and
efficiency of neural information processing
- URL: http://arxiv.org/abs/2207.07338v6
- Date: Tue, 4 Apr 2023 10:44:38 GMT
- Title: Context-sensitive neocortical neurons transform the effectiveness and
efficiency of neural information processing
- Authors: Ahsan Adeel, Mario Franco, Mohsin Raza, Khubaib Ahmed
- Abstract summary: Deep learning (DL) has big-data processing capabilities that are as good, or even better, than those of humans in many real-world domains.
This comes at the cost of high energy requirements, which may be unsustainable in some applications, and of errors that, though infrequent, can be large.
We show how to circumvent these limitations by mimicking the capabilities of context-sensitive neocortical neurons.
- Score: 0.783788180051711
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Deep learning (DL) has big-data processing capabilities that are as good, or
even better, than those of humans in many real-world domains, but at the cost
of high energy requirements that may be unsustainable in some applications and
of errors that, though infrequent, can be large. We hypothesise that a
fundamental weakness of DL lies in its intrinsic dependence on
integrate-and-fire point neurons that maximise information transmission
irrespective of whether it is relevant in the current context or not. This
leads to unnecessary neural firing and to the feedforward transmission of
conflicting messages, which makes learning difficult and processing energy
inefficient. Here we show how to circumvent these limitations by mimicking the
capabilities of context-sensitive neocortical neurons that receive input from
diverse sources as a context to amplify and attenuate the transmission of
relevant and irrelevant information, respectively. We demonstrate that a deep
network composed of such local processors seeks to maximise agreement between
the active neurons, thus restricting the transmission of conflicting
information to higher levels and reducing the neural activity required to
process large amounts of heterogeneous real-world data. Shown here to be far
more effective and efficient than current forms of DL, this two-point neuron
study offers a possible step-change in the cellular foundations of deep
network architectures.
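The mechanism lends itself to a compact illustration. Below is a minimal sketch of one such two-point local processor, assuming a simple multiplicative modulation in which contextual input scales, but cannot by itself drive, the feedforward signal; the function name and the exact transfer function are illustrative, not the paper's.

```python
import numpy as np

def two_point_neuron(x_ff, x_ctx, w_ff, w_ctx):
    """Illustrative two-point (context-sensitive) neuron.

    r integrates the feedforward (receptive-field) input; c integrates
    contextual input from diverse sources. Context amplifies transmission
    when it agrees with the feedforward drive and attenuates it when it
    conflicts, but zero feedforward drive always yields zero output.
    """
    r = np.dot(w_ff, x_ff)    # basal/feedforward integration site
    c = np.dot(w_ctx, x_ctx)  # apical/contextual integration site
    # Hypothetical modulatory transfer function: sign agreement between
    # r and c amplifies the output (factor up to 2); disagreement
    # attenuates it (factor toward 0). The output never changes sign.
    return r * (1.0 + np.tanh(r * c))
```

In a network of such units, the contextual weights can be trained so that neighbouring processors amplify one another only when their messages agree, which is the agreement-maximising behaviour the abstract describes.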
Related papers
- A frugal Spiking Neural Network for unsupervised classification of continuous multivariate temporal data [0.0]
Spiking Neural Networks (SNNs) are neuromorphic and use more biologically plausible neurons with evolving membrane potentials.
We introduce here a frugal single-layer SNN designed for fully unsupervised identification and classification of multivariate temporal patterns in continuous data.
arXiv Detail & Related papers (2024-08-08T08:15:51Z)
- Neuromorphic Event-Driven Semantic Communication in Microgrids [5.817656520396958]
This paper proposes neuromorphic learning to implant communicative features using spiking neural networks (SNNs) at each node.
Unlike conventional neuromorphic sensors that operate with spiking signals, we employ an event-driven selective process to collect sparse data for training SNNs.
arXiv Detail & Related papers (2024-02-28T15:11:02Z)
- Addressing caveats of neural persistence with deep graph persistence [54.424983583720675]
We find that the variance of network weights and spatial concentration of large weights are the main factors that impact neural persistence.
We propose an extension of the filtration underlying neural persistence to the whole neural network instead of single layers.
This yields our deep graph persistence measure, which implicitly incorporates persistent paths through the network and alleviates variance-related issues.
arXiv Detail & Related papers (2023-07-20T13:34:11Z)
- Solving Large-scale Spatial Problems with Convolutional Neural Networks [88.31876586547848]
We employ transfer learning to improve training efficiency for large-scale spatial problems.
We propose that a convolutional neural network (CNN) can be trained on small windows of signals but evaluated on arbitrarily large signals with little to no performance degradation (see the fully convolutional sketch after this list).
arXiv Detail & Related papers (2023-06-14T01:24:42Z)
- Impact of spiking neurons leakages and network recurrences on event-based spatio-temporal pattern recognition [0.0]
Spiking neural networks coupled with neuromorphic hardware and event-based sensors are attracting increasing interest for low-latency and low-power inference at the edge.
We explore the impact of synaptic and membrane leakages in spiking neurons (see the leaky integrate-and-fire sketch after this list).
arXiv Detail & Related papers (2022-11-14T21:34:02Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Overcoming the Domain Gap in Contrastive Learning of Neural Action Representations [60.47807856873544]
A fundamental goal in neuroscience is to understand the relationship between neural activity and behavior.
We generated a new multimodal dataset consisting of the spontaneous behaviors generated by fruit flies.
This dataset and our new set of augmentations promise to accelerate the application of self-supervised learning methods in neuroscience.
arXiv Detail & Related papers (2021-11-29T15:27:51Z)
- Learning in Feedforward Neural Networks Accelerated by Transfer Entropy [0.0]
Transfer entropy (TE) was initially introduced as an information transfer measure used to quantify the statistical coherence between events (time series).
Our contribution is an information-theoretical method for analyzing information transfer between the nodes of feedforward neural networks.
We introduce a backpropagation-type training algorithm that uses TE feedback connections to improve its performance (see the TE estimator sketch after this list).
arXiv Detail & Related papers (2021-04-29T19:07:07Z)
- And/or trade-off in artificial neurons: impact on adversarial robustness [91.3755431537592]
The presence of a sufficient number of OR-like neurons in a network can lead to classification brittleness and increased vulnerability to adversarial attacks.
We define AND-like neurons and propose measures to increase their proportion in the network.
Experimental results on the MNIST dataset suggest that our approach holds promise as a direction for further exploration.
arXiv Detail & Related papers (2021-02-15T08:19:05Z)
- Information contraction in noisy binary neural networks and its implications [11.742803725197506]
We consider noisy binary neural networks, where each neuron has a non-zero probability of producing an incorrect output.
Our key finding is a lower bound for the required number of neurons in noisy neural networks, which is the first of its kind.
This paper offers new understanding of noisy information processing systems through the lens of information theory.
arXiv Detail & Related papers (2021-01-28T00:01:45Z)
- Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
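For the "Solving Large-scale Spatial Problems" entry above, the train-on-small/evaluate-on-large property follows from using only size-agnostic layers. A minimal PyTorch sketch of that general idea (illustrative; not the paper's actual architecture):

```python
import torch
import torch.nn as nn

# Fully convolutional 1-D network: no flatten or linear layer, so the
# same trained weights accept inputs of any length.
model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=5, padding=2),
    nn.ReLU(),
    nn.Conv1d(16, 16, kernel_size=5, padding=2),
    nn.ReLU(),
    nn.Conv1d(16, 1, kernel_size=1),
)

windows = torch.randn(8, 1, 32)    # train on small windows ...
signal = torch.randn(1, 1, 4096)   # ... evaluate on a much larger signal
assert model(windows).shape == (8, 1, 32)
assert model(signal).shape == (1, 1, 4096)
```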
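For the entry on spiking-neuron leakages, the two leakages in question can be written down directly. Below is a sketch of one update step of a leaky integrate-and-fire layer with both synaptic and membrane decay; all names and constants are illustrative, not taken from the paper.

```python
import numpy as np

def lif_step(v, i_syn, spikes_in, w, dt=1.0,
             tau_mem=20.0, tau_syn=5.0, v_thresh=1.0):
    """One step of a leaky integrate-and-fire layer.

    i_syn decays with tau_syn (synaptic leakage) and is driven by
    incoming spikes; v decays with tau_mem (membrane leakage) and is
    driven by i_syn. Crossing v_thresh emits a spike and resets v.
    """
    i_syn = i_syn * np.exp(-dt / tau_syn) + w @ spikes_in
    v = v * np.exp(-dt / tau_mem) + dt * i_syn
    spikes_out = v >= v_thresh
    v = np.where(spikes_out, 0.0, v)
    return v, i_syn, spikes_out
```

Setting tau_mem or tau_syn to np.inf removes the corresponding leakage, which is the kind of ablation such a study can perform.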
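For the transfer-entropy entry, TE from X to Y is defined as TE(X -> Y) = sum over (y_next, y, x) of p(y_next, y, x) * log[ p(y_next | y, x) / p(y_next | y) ]. A naive plug-in estimator for discrete series (illustrative only; the paper's estimator and its TE feedback connections are not reproduced here):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y), in bits, for discrete sequences."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y_prev, x_prev)
    n = len(y) - 1
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_full = c / n                      # p(y_next, y_prev, x_prev)
        p_yx = sum(v for (a, b, d), v in triples.items() if (b, d) == (yp, xp)) / n
        p_y = sum(v for (a, b, d), v in triples.items() if b == yp) / n
        p_yy = sum(v for (a, b, d), v in triples.items() if (a, b) == (yn, yp)) / n
        te += p_full * np.log2((p_full / p_yx) / (p_yy / p_y))
    return te

# A driven sequence shows positive TE from its driver, but not back:
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)
y = np.roll(x, 1)                   # y copies x with one step of lag
print(transfer_entropy(x, y))       # close to 1 bit
print(transfer_entropy(y, x))       # close to 0 bits
```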