Deep Convolutional Neural Networks: A survey of the foundations,
selected improvements, and some current applications
- URL: http://arxiv.org/abs/2011.12960v1
- Date: Wed, 25 Nov 2020 19:03:23 GMT
- Title: Deep Convolutional Neural Networks: A survey of the foundations,
selected improvements, and some current applications
- Authors: Lars Lien Ankile, Morgan Feet Heggland, Kjartan Krange
- Abstract summary: This paper seeks to present and discuss one such method, namely Convolutional Neural Networks (CNNs)
CNNs are deep neural networks that use a special linear operation called convolution.
This paper discusses two applications of convolution that have proven to be very effective in practice.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Within the world of machine learning there exists a wide range of different
methods with respective advantages and applications. This paper seeks to
present and discuss one such method, namely Convolutional Neural Networks
(CNNs). CNNs are deep neural networks that use a special linear operation
called convolution. This operation represents a key and distinctive element of
CNNs, and will therefore be the focus of this method paper. The discussion
starts with the theoretical foundations that underlie convolutions and CNNs.
The discussion then turns to improvements and augmentations that adapt the
method to estimate a wider set of function classes. The paper mainly
investigates two ways of improving the method: by
using locally connected layers, which can make the network less invariant to
translation, and tiled convolution, which allows for the learning of more
complex invariances than standard convolution. Furthermore, the use of the Fast
Fourier Transform can improve the computational efficiency of convolution.
Subsequently, this paper discusses two applications of convolution that have
proven to be very effective in practice. First, the YOLO architecture is a
state-of-the-art neural network for object detection, which
accurately predicts bounding boxes around objects in images. Second, tumor
detection in mammography may be performed using CNNs, achieving 7.2% higher
specificity than human doctors with only 0.3% lower sensitivity. Finally, the
invention of technology that outperforms humans in different fields also raises
certain ethical and regulatory questions that are briefly discussed.
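Two of the technical points in the abstract, convolution as a linear operation and the Fast Fourier Transform route to computing it, can be made concrete in a short NumPy sketch. The function names and array sizes below are illustrative, not from the paper:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Direct 2D 'valid' convolution (kernel flipped, per the mathematical definition)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    flipped = kernel[::-1, ::-1]
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * flipped)
    return out

def conv2d_fft(image, kernel):
    """Same convolution via the convolution theorem: pointwise product in the Fourier domain."""
    # Zero-pad both arrays to the full linear-convolution size to avoid wrap-around.
    shape = (image.shape[0] + kernel.shape[0] - 1,
             image.shape[1] + kernel.shape[1] - 1)
    full = np.real(np.fft.ifft2(np.fft.fft2(image, shape) * np.fft.fft2(kernel, shape)))
    kh, kw = kernel.shape
    # Slice out the 'valid' region of the full convolution.
    return full[kh - 1:image.shape[0], kw - 1:image.shape[1]]

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
ker = rng.standard_normal((3, 3))
direct = conv2d_valid(img, ker)
via_fft = conv2d_fft(img, ker)
```

The FFT variant pays off as kernels grow, since the pointwise product in the frequency domain replaces the nested summation of the direct method.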
Related papers
- Enhancing Convolutional Neural Networks with Higher-Order Numerical Difference Methods [6.26650196870495]
Convolutional Neural Networks (CNNs) have been able to assist humans in solving many real-world problems.
This paper proposes a stacking scheme based on the linear multi-step method to enhance the performance of CNNs.
arXiv Detail & Related papers (2024-09-08T05:13:58Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- What to Do When Your Discrete Optimization Is the Size of a Neural Network? [24.546550334179486]
Machine learning applications using neural networks involve solving discrete optimization problems.
Classical approaches used in discrete settings do not scale well to large neural networks.
The paper takes continuation path (CP) methods as representative of purely continuous approaches and Monte Carlo (MC) methods as representative of purely discrete ones.
arXiv Detail & Related papers (2024-02-15T21:57:43Z)
- Training Convolutional Neural Networks with the Forward-Forward algorithm [1.74440662023704]
The Forward-Forward (FF) algorithm has so far only been used in fully connected networks.
We show how the FF paradigm can be extended to CNNs.
Our FF-trained CNN, featuring a novel spatially-extended labeling technique, achieves a classification accuracy of 99.16% on the MNIST hand-written digits dataset.
arXiv Detail & Related papers (2023-12-22T18:56:35Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- Low-Energy Convolutional Neural Networks (CNNs) using Hadamard Method [0.0]
Convolutional neural networks (CNNs) are a potential approach for object recognition and detection.
A new approach based on the Hadamard transformation as an alternative to the convolution operation is demonstrated.
The method is helpful for other computer vision tasks when the kernel size is smaller than the input image size.
arXiv Detail & Related papers (2022-09-06T21:36:57Z)
- Classifying high-dimensional Gaussian mixtures: Where kernel methods fail and neural networks succeed [27.38015169185521]
We show theoretically that two-layer neural networks (2LNN) with only a few hidden neurons can beat the performance of kernel learning.
We show how over-parametrising the neural network leads to faster convergence, but does not improve its final performance.
arXiv Detail & Related papers (2021-02-23T15:10:15Z)
- Spatio-Temporal Inception Graph Convolutional Networks for Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z)
- EvoPose2D: Pushing the Boundaries of 2D Human Pose Estimation using Accelerated Neuroevolution with Weight Transfer [82.28607779710066]
We explore the application of neuroevolution, a form of neural architecture search inspired by biological evolution, in the design of 2D human pose networks.
Our method produces network designs that are more efficient and more accurate than state-of-the-art hand-designed networks.
arXiv Detail & Related papers (2020-11-17T05:56:16Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Curriculum By Smoothing [52.08553521577014]
Convolutional Neural Networks (CNNs) have shown impressive performance in computer vision tasks such as image classification, detection, and segmentation.
We propose an elegant curriculum-based scheme that smooths the feature embedding of a CNN using anti-aliasing or low-pass filters.
As the amount of information in the feature maps increases during training, the network is able to progressively learn better representations of the data.
arXiv Detail & Related papers (2020-03-03T07:27:44Z)
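The hidden-neuron permutation symmetry underlying the Permutation Equivariant Neural Functionals entry above, that hidden-layer neurons have no inherent order, can be checked directly. A minimal NumPy sketch with made-up layer sizes:

```python
import numpy as np

rng = np.random.default_rng(1)

# A two-layer feedforward network x -> relu(W1 @ x + b1) -> W2 @ h + b2,
# with 3 inputs, 5 hidden neurons, and 2 outputs (sizes are illustrative).
W1, b1 = rng.standard_normal((5, 3)), rng.standard_normal(5)
W2, b2 = rng.standard_normal((2, 5)), rng.standard_normal(2)

def mlp(x, W1, b1, W2, b2):
    h = np.maximum(W1 @ x + b1, 0.0)  # ReLU hidden layer
    return W2 @ h + b2

# Reorder the 5 hidden neurons: permute the rows of (W1, b1) and the
# matching columns of W2. The network computes exactly the same function.
perm = rng.permutation(5)
x = rng.standard_normal(3)
y_original = mlp(x, W1, b1, W2, b2)
y_permuted = mlp(x, W1[perm], b1[perm], W2[:, perm], b2)
```

This is the symmetry that equivariant weight-space models are built to respect: any of the 5! hidden-neuron orderings encodes the same underlying function.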
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.