Convolutional Neural Networks Demystified: A Matched Filtering
Perspective Based Tutorial
- URL: http://arxiv.org/abs/2108.11663v1
- Date: Thu, 26 Aug 2021 09:07:49 GMT
- Authors: Ljubisa Stankovic and Danilo Mandic
- Abstract summary: Convolutional Neural Networks (CNNs) are a de facto standard for the analysis of large volumes of signals and images.
We revisit their operation from first principles and from a matched-filtering perspective.
It is our hope that this tutorial will help shed new light on, and bring physical intuition to, the understanding and further development of deep neural networks.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep Neural Networks (DNNs), and especially Convolutional Neural Networks (CNNs),
are a de facto standard for the analysis of large volumes of signals and
images. Yet, their development has been largely ad hoc, and their underlying
principles treated in a black-box fashion. To help demystify CNNs, we
revisit their operation from first principles and from a matched-filtering
perspective. We establish that the convolution operation within CNNs, their
very backbone, represents a matched filter which examines the input
signal/image for the presence of pre-defined features. This perspective is
shown to be physically meaningful, and serves as a basis for a step-by-step
tutorial on the operation of CNNs, including pooling, zero padding, and various
ways of dimensionality reduction. Starting from first principles, both the
feed-forward pass and the learning stage (via back-propagation) are illuminated
in detail, through both a worked-out numerical example and the corresponding
visualizations. It is our hope that this tutorial will help shed new light on,
and bring physical intuition to, the understanding and further development of
deep neural networks.
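The matched-filtering view of CNN convolution can be illustrated with a minimal sketch (not taken from the paper): a CNN "convolution" layer computes a sliding inner product (cross-correlation) between the input and a small feature template, so the output peaks wherever the input locally matches the template. The signal, template, and function name below are illustrative choices, not the authors' notation.

```python
import numpy as np

def matched_filter_1d(signal, template):
    """Valid-mode cross-correlation, as computed by a CNN 'convolution' layer
    (no kernel flip): the inner product of the template with each window."""
    n = len(signal) - len(template) + 1
    return np.array([np.dot(signal[i:i + len(template)], template)
                     for i in range(n)])

# A signal containing the feature [1, 2, 1] starting at index 4.
signal = np.array([0., 0., 0., 0., 1., 2., 1., 0., 0., 0.])
template = np.array([1., 2., 1.])

response = matched_filter_1d(signal, template)
print(int(np.argmax(response)))  # strongest response at index 4, where the feature sits
```

In this reading, learning a convolutional kernel via back-propagation amounts to learning which feature template the layer should be matched to.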
Related papers
- Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective (arXiv, 2023-06-24)
  Graph neural networks (GNNs) have pioneered advancements in graph representation learning. This study investigates the role of graph convolution within the context of feature learning theory.
- Gradient Descent in Neural Networks as Sequential Learning in RKBS (arXiv, 2023-02-01)
  We construct an exact power-series representation of the neural network in a finite neighborhood of the initial weights. We prove that, regardless of width, the training sequence produced by gradient descent can be exactly replicated by regularized sequential learning.
- Neural networks trained with SGD learn distributions of increasing complexity (arXiv, 2022-11-21)
  We show that neural networks trained using gradient descent initially classify their inputs using lower-order input statistics, and exploit higher-order statistics only later in training. We discuss the relation of DSB to other simplicity biases and consider its implications for the principle of universality in learning.
- Demystifying CNNs for Images by Matched Filters (arXiv, 2022-10-16)
  Convolutional neural networks (CNNs) have been revolutionising the way we approach and use intelligent machines in the Big Data era. CNNs have been put under scrutiny owing to their black-box nature, as well as the lack of theoretical support and physical meaning of their operation. This paper attempts to demystify the operation of CNNs by employing the perspective of matched filtering.
- Visual Explanations for Convolutional Neural Networks via Latent Traversal of Generative Adversarial Networks (arXiv, 2021-10-29)
  We present a method for interpreting what a convolutional neural network (CNN) has learned by utilizing Generative Adversarial Networks (GANs). Our GAN framework disentangles lung structure from COVID-19 features. Using this GAN, we can visualize the transition of a pair of COVID-negative lungs in a chest radiograph to a COVID-positive pair by interpolating in the latent space of the GAN.
- The Mind's Eye: Visualizing Class-Agnostic Features of CNNs (arXiv, 2021-01-29)
  We propose an approach to visually interpret CNN features given a set of images, by creating corresponding images that depict the most informative features of a specific layer. Our method uses a dual-objective activation and distance loss, without requiring a generator network or modifications to the original model.
- Shape or Texture: Understanding Discriminative Features in CNNs (arXiv, 2021-01-27)
  Recent studies have shown that CNNs actually exhibit a "texture bias". We show that a network learns the majority of overall shape information in the first few epochs of training. We also show that the encoding of shape does not imply the encoding of localized per-pixel semantic information.
- Overcoming Catastrophic Forgetting in Graph Neural Networks (arXiv, 2020-12-10)
  Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks. We propose a novel scheme dedicated to overcoming this problem, and hence to strengthening continual learning in graph neural networks (GNNs). At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
- An Information-theoretic Visual Analysis Framework for Convolutional Neural Networks (arXiv, 2020-05-02)
  We introduce a data model to organize the data that can be extracted from CNN models. We then propose two ways to calculate entropy under different circumstances. We develop a visual analysis system, CNNSlicer, to interactively explore the amount of information changes inside the model.
- Curriculum By Smoothing (arXiv, 2020-03-03)
  Convolutional Neural Networks (CNNs) have shown impressive performance in computer vision tasks such as image classification, detection, and segmentation. We propose an elegant curriculum-based scheme that smoothes the feature embedding of a CNN using anti-aliasing or low-pass filters. As the amount of information in the feature maps increases during training, the network is able to progressively learn better representations of the data.
This list is automatically generated from the titles and abstracts of the papers in this site.