Convolutional Neural Networks Exploiting Attributes of Biological
Neurons
- URL: http://arxiv.org/abs/2311.08314v1
- Date: Tue, 14 Nov 2023 16:58:18 GMT
- Title: Convolutional Neural Networks Exploiting Attributes of Biological
Neurons
- Authors: Neeraj Kumar Singh, Nikhil R. Pal
- Abstract summary: Deep neural networks like Convolutional Neural Networks (CNNs) have emerged as front-runners, often surpassing human capabilities.
Here, we integrate the principles of biological neurons in certain layer(s) of CNNs.
We aim to extract image features to use as input to CNNs, hoping to enhance training efficiency and achieve better accuracy.
- Score: 7.3517426088986815
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this era of artificial intelligence, deep neural networks like
Convolutional Neural Networks (CNNs) have emerged as front-runners, often
surpassing human capabilities. These deep networks are often perceived as the
panacea for all challenges. Unfortunately, a common downside of these networks
is their "black-box" character, which does not necessarily mirror the
operation of biological neural systems. Some even have millions or billions of
learnable (tunable) parameters, and their training demands extensive data and
time.
Here, we integrate the principles of biological neurons into certain layer(s)
of CNNs. Specifically, we explore the use of neuroscience-inspired
computational models of the Lateral Geniculate Nucleus (LGN) and of simple
cells of the primary visual cortex. By leveraging such models, we aim to
extract image features to use as input to CNNs, hoping to enhance training
efficiency and achieve better accuracy. We aspire to enable shallow networks,
with a Push-Pull Combination of Receptive Fields (PP-CORF) model of simple
cells as the foundation layer of CNNs, to enhance their learning process and
performance. To achieve this, we propose a two-tower CNN, with one shallow
tower and the other a ResNet-18. Rather than extracting features blindly, the
system seeks to mimic how the brain perceives and extracts features. The
proposed system exhibits a noticeable improvement in performance (on average
$5\%$-$10\%$) on the CIFAR-10, CIFAR-100, and ImageNet-100 datasets compared
to ResNet-18. We also assess the efficiency of the Push-Pull tower of the
network on its own.
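The abstract does not include implementation details, but the two-tower idea can be illustrated with a rough PyTorch sketch. Everything below is hypothetical, not the authors' code: the `PushPullConv` layer is a deliberately simplified stand-in for the PP-CORF model (an excitatory "push" response minus a scaled inhibitory "pull" response to the inverted input), and a tiny residual stack stands in for the ResNet-18 tower.

```python
# Hypothetical sketch of a two-tower CNN with a push-pull style front end.
# Not the paper's PP-CORF model; a simplified illustration only.
import torch
import torch.nn as nn


class PushPullConv(nn.Module):
    """Simplified push-pull layer: push = relu(conv(x)),
    pull = relu(conv(-x)); output = push - alpha * pull."""

    def __init__(self, in_ch, out_ch, k=5, alpha=0.5):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, k, padding=k // 2)
        self.alpha = alpha

    def forward(self, x):
        # Same kernel applied to the inverted input gives the "pull" response.
        return torch.relu(self.conv(x)) - self.alpha * torch.relu(self.conv(-x))


class ResBlock(nn.Module):
    """Minimal residual block, standing in for ResNet-18's basic blocks."""

    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch),
        )

    def forward(self, x):
        return torch.relu(x + self.body(x))


class TwoTowerCNN(nn.Module):
    """Shallow push-pull tower plus a deep tower; pooled features from
    both towers are concatenated before the classifier."""

    def __init__(self, num_classes=10):
        super().__init__()
        # Shallow tower: push-pull layer feeding a small conv stack.
        self.shallow = nn.Sequential(
            PushPullConv(3, 32),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Deep tower: in the paper this would be a ResNet-18.
        self.deep = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            ResBlock(64), ResBlock(64),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(64 + 64, num_classes)

    def forward(self, x):
        feats = torch.cat([self.shallow(x), self.deep(x)], dim=1)
        return self.classifier(feats)


model = TwoTowerCNN(num_classes=10)
logits = model(torch.randn(2, 3, 32, 32))  # batch of two 32x32 RGB images
print(tuple(logits.shape))  # (2, 10)
```

In a full implementation one would replace the `deep` branch with `torchvision.models.resnet18` (with its final `fc` layer removed) and the `shallow` branch's first layer with the actual PP-CORF filter bank.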
Related papers
- NEAR: A Training-Free Pre-Estimator of Machine Learning Model Performance [0.0]
  We propose a zero-cost proxy, Network Expressivity by Activation Rank (NEAR), to identify the optimal neural network without training.
  We demonstrate a state-of-the-art correlation between this network score and model accuracy on NAS-Bench-101 and NATS-Bench-SSS/TSS.
  arXiv Detail & Related papers (2024-08-16T14:38:14Z)
- Fully Spiking Actor Network with Intra-layer Connections for Reinforcement Learning [51.386945803485084]
  We focus on tasks where the agent needs to learn multi-dimensional deterministic policies for control.
  Most existing spike-based RL methods take the firing rate as the output of SNNs and convert it to a continuous action space (i.e., the deterministic policy) through a fully-connected layer.
  To develop a fully spiking actor network without any floating-point matrix operations, we draw inspiration from the non-spiking interneurons found in insects.
  arXiv Detail & Related papers (2024-01-09T07:31:34Z)
- Training Convolutional Neural Networks with the Forward-Forward Algorithm [1.74440662023704]
  The Forward-Forward (FF) algorithm has so far been used only in fully connected networks.
  We show how the FF paradigm can be extended to CNNs.
  Our FF-trained CNN, featuring a novel spatially-extended labeling technique, achieves a classification accuracy of 99.16% on the MNIST hand-written digits dataset.
  arXiv Detail & Related papers (2023-12-22T18:56:35Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
  Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
  We propose a novel event-based video reconstruction framework based on a fully spiking neural network (EVSNN).
  We find that spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
  arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Learning from Event Cameras with Sparse Spiking Convolutional Neural Networks [0.0]
  Convolutional neural networks (CNNs) are now the de facto solution for computer vision problems.
  We propose an end-to-end biologically inspired approach using event cameras and spiking neural networks (SNNs).
  Our method enables the training of sparse spiking neural networks directly on event data, using the popular deep learning framework PyTorch.
  arXiv Detail & Related papers (2021-04-26T13:52:01Z)
- BreakingBED -- Breaking Binary and Efficient Deep Neural Networks by Adversarial Attacks [65.2021953284622]
  We study the robustness of CNNs against white-box and black-box adversarial attacks.
  Results are shown for distilled CNNs, agent-based state-of-the-art pruned models, and binarized neural networks.
  arXiv Detail & Related papers (2021-03-14T20:43:19Z)
- Combining Spiking Neural Network and Artificial Neural Network for Enhanced Image Classification [1.8411688477000185]
  Spiking neural networks (SNNs), which more closely resemble biological brain synapses, have attracted attention owing to their low power consumption.
  We build versatile hybrid neural networks (HNNs) that improve the concerned performance.
  arXiv Detail & Related papers (2021-02-21T12:03:16Z)
- The Mind's Eye: Visualizing Class-Agnostic Features of CNNs [92.39082696657874]
  We propose an approach to visually interpret CNN features given a set of images by creating corresponding images that depict the most informative features of a specific layer.
  Our method uses a dual-objective activation and distance loss, without requiring a generator network or modifications to the original model.
  arXiv Detail & Related papers (2021-01-29T07:46:39Z)
- Effective and Efficient Computation with Multiple-timescale Spiking Recurrent Neural Networks [0.9790524827475205]
  We show how a novel type of adaptive spiking recurrent neural network (SRNN) is able to achieve state-of-the-art performance.
  We calculate a $>$100x energy improvement for our SRNNs over classical RNNs on the harder tasks.
  arXiv Detail & Related papers (2020-05-24T01:04:53Z)
- Curriculum By Smoothing [52.08553521577014]
  Convolutional Neural Networks (CNNs) have shown impressive performance in computer vision tasks such as image classification, detection, and segmentation.
  We propose an elegant curriculum-based scheme that smooths the feature embeddings of a CNN using anti-aliasing (low-pass) filters.
  As the amount of information in the feature maps increases during training, the network is able to progressively learn better representations of the data.
  arXiv Detail & Related papers (2020-03-03T07:27:44Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
  We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
  We conduct experiments on six benchmark data sets from computer vision, signal processing, and natural language processing.
  arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.