Convolutional Neural Networks with Gated Recurrent Connections
- URL: http://arxiv.org/abs/2106.02859v1
- Date: Sat, 5 Jun 2021 10:14:59 GMT
- Title: Convolutional Neural Networks with Gated Recurrent Connections
- Authors: Jianfeng Wang, Xiaolin Hu
- Abstract summary: The recurrent convolutional neural network (RCNN) is inspired by the abundant recurrent connections in the visual systems of animals.
We propose to modulate the receptive fields (RFs) of neurons by introducing gates to the recurrent connections.
The GRCNN was evaluated on several computer vision tasks including object recognition, scene text recognition and object detection.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The convolutional neural network (CNN) has become a basic model for solving
many computer vision problems. In recent years, a new class of CNNs, the recurrent
convolutional neural network (RCNN), was proposed, inspired by the abundant
recurrent connections in the visual systems of animals. The critical element of
the RCNN is the recurrent convolutional layer (RCL), which incorporates recurrent
connections between neurons in a standard convolutional layer. With an
increasing number of recurrent computations, the receptive fields (RFs) of
neurons in the RCL expand unboundedly, which is inconsistent with biological facts.
We propose to modulate the RFs of neurons by introducing gates to the recurrent
connections. The gates control the amount of context information flowing into
the neurons, so the neurons' RFs become adaptive. The resulting layer
is called the gated recurrent convolution layer (GRCL). Multiple GRCLs constitute a
deep model called gated RCNN (GRCNN). The GRCNN was evaluated on several
computer vision tasks including object recognition, scene text recognition and
object detection, and obtained much better results than the RCNN. In addition,
when combined with other adaptive RF techniques, the GRCNN demonstrated
competitive performance to the state-of-the-art models on benchmark datasets
for these tasks. The code is released at https://github.com/Jianf-Wang/GRCNN.
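To make the gating idea concrete, the following is a minimal PyTorch sketch of a GRCL-style layer, written solely from the description in the abstract: a sigmoid gate, computed from the feed-forward input and the previous state, scales the recurrent convolution, so each neuron admits only as much context as its gate allows. The class name, kernel sizes, iteration count, and batch-norm placement are illustrative assumptions, not the authors' exact design; the released code above is authoritative.

```python
import torch
import torch.nn as nn

class GRCLSketch(nn.Module):
    """Illustrative gated recurrent convolution layer (not the official GRCL).

    At t = 0 the layer applies a plain feed-forward convolution. Each later
    iteration computes a sigmoid gate from the feed-forward input and the
    previous state, then adds a gate-modulated recurrent convolution of the
    previous state, so context flows in only where the gate opens.
    """

    def __init__(self, channels: int, iterations: int = 3):
        super().__init__()
        self.iterations = iterations
        self.conv_f = nn.Conv2d(channels, channels, 3, padding=1, bias=False)  # feed-forward path
        self.conv_r = nn.Conv2d(channels, channels, 3, padding=1, bias=False)  # recurrent path
        self.gate_f = nn.Conv2d(channels, channels, 1, bias=False)  # gate, feed-forward input
        self.gate_r = nn.Conv2d(channels, channels, 1, bias=False)  # gate, recurrent state
        # One BatchNorm per branch, shared across iterations (a simplification).
        self.bn_f, self.bn_r = nn.BatchNorm2d(channels), nn.BatchNorm2d(channels)
        self.bn_gf, self.bn_gr = nn.BatchNorm2d(channels), nn.BatchNorm2d(channels)

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        x = torch.relu(self.bn_f(self.conv_f(u)))  # t = 0: no recurrence yet
        for _ in range(self.iterations):
            # Gate in (0, 1): how much recurrent context each neuron admits.
            g = torch.sigmoid(self.bn_gf(self.gate_f(u)) + self.bn_gr(self.gate_r(x)))
            # Gated update: constant feed-forward term + gate-modulated recurrence.
            x = torch.relu(self.bn_f(self.conv_f(u)) + g * self.bn_r(self.conv_r(x)))
        return x

# Smoke test: channel count and spatial size are preserved.
layer = GRCLSketch(channels=64, iterations=3)
out = layer(torch.randn(2, 64, 32, 32))
print(out.shape)  # torch.Size([2, 64, 32, 32])
```

Sharing the BatchNorm modules across iterations is purely for brevity; per-iteration statistics would follow the RCNN/GRCNN literature more closely.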
Related papers
- Investigating Sparsity in Recurrent Neural Networks
This thesis investigates the effects of pruning and of Sparse Recurrent Neural Networks on the performance of RNNs.
We first describe the pruning of RNNs, its impact on the performance of RNNs, and the number of training epochs required to regain accuracy after the pruning is performed.
Next, we continue with the creation and training of Sparse Recurrent Neural Networks and identify the relation between the performance and the graph properties of its underlying arbitrary structure.
arXiv Detail & Related papers (2024-07-30T07:24:58Z)
- Accurate Mapping of RNNs on Neuromorphic Hardware with Adaptive Spiking Neurons
We present a $\Sigma\Delta$-low-pass RNN (lpRNN) for mapping rate-based RNNs to spiking neural networks (SNNs).
An adaptive spiking neuron model encodes signals using $\Sigma\Delta$-modulation and enables precise mapping.
We demonstrate the implementation of the lpRNN on Intel's neuromorphic research chip Loihi.
arXiv Detail & Related papers (2024-07-18T14:06:07Z)
- Random-coupled Neural Network
The pulse-coupled neural network (PCNN) is a widely applied model for imitating characteristics of the human brain in computer vision and neural network research.
In this study, the random-coupled neural network (RCNN) is proposed.
It overcomes difficulties in the PCNN's neuromorphic computing via a random inactivation process.
arXiv Detail & Related papers (2024-03-26T09:13:06Z)
- Heterogeneous Recurrent Spiking Neural Network for Spatio-Temporal Classification
Spiking Neural Networks are often touted as brain-inspired learning models for the third wave of Artificial Intelligence.
This paper presents a heterogeneous spiking neural network (HRSNN) with unsupervised learning for video recognition tasks.
We show that HRSNN can achieve performance similar to state-of-the-art backpropagation-trained supervised SNNs, but with less computation.
arXiv Detail & Related papers (2022-09-22T16:34:01Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
- Deep Neural Networks using a Single Neuron: Folded-in-Time Architecture using Feedback-Modulated Delay Loops
We present a method for folding a deep neural network of arbitrary size into a single neuron with multiple time-delayed feedback loops.
This single-neuron deep neural network comprises only a single nonlinearity and appropriately adjusted modulations of the feedback signals.
The new method, which we call Folded-in-time DNN (Fit-DNN), exhibits promising performance in a set of benchmark tasks.
arXiv Detail & Related papers (2020-11-19T21:45:58Z)
- Graph Neural Networks: Architectures, Stability and Transferability
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.