Regularized Pooling
- URL: http://arxiv.org/abs/2005.03709v2
- Date: Thu, 6 Aug 2020 07:10:34 GMT
- Title: Regularized Pooling
- Authors: Takato Otsuzuki, Hideaki Hayashi, Yuchen Zheng and Seiichi Uchida
- Abstract summary: In convolutional neural networks (CNNs), pooling operations play important roles such as dimensionality reduction and deformation compensation.
We propose regularized pooling, which enables the value selection direction in the pooling operation to be spatially smooth across adjacent kernels.
- Score: 12.387676601792899
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In convolutional neural networks (CNNs), pooling operations play important
roles such as dimensionality reduction and deformation compensation. In
general, max pooling, which is the most widely used operation for local
pooling, is performed independently for each kernel. However, the deformation
may be spatially smooth over the neighboring kernels. This means that max
pooling is too flexible to compensate for actual deformations. In other words,
its excessive flexibility risks canceling the essential spatial differences
between classes. In this paper, we propose regularized pooling, which enables
the value selection direction in the pooling operation to be spatially smooth
across adjacent kernels so as to compensate only for actual deformations. The
results of experiments on handwritten character images and texture images
showed that regularized pooling not only improves recognition accuracy but also
accelerates the convergence of learning compared with conventional pooling
operations.
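The abstract describes the mechanism only at a high level, so below is a minimal, hedged reading of it in code: compute the argmax offset (the "value selection direction") in each pooling window, smooth those offsets over neighbouring windows, and then pool at the smoothed positions. The 2x2 non-overlapping windows, the average-filter smoothing, and the function name `regularized_pool2d` are assumptions for illustration, not the authors' implementation.

```python
# Sketch of the regularized-pooling idea (assumptions: 2x2 windows, average-filter smoothing).
import torch
import torch.nn.functional as F

def regularized_pool2d(x: torch.Tensor, kernel: int = 2, smooth: int = 3) -> torch.Tensor:
    n, c, h, w = x.shape
    hout, wout = h // kernel, w // kernel
    # Unfold into pooling windows: (N, C*k*k, L) -> (N, C, k*k, Hout, Wout)
    cols = F.unfold(x, kernel_size=kernel, stride=kernel)
    cols = cols.view(n, c, kernel * kernel, hout, wout)

    # Per-window argmax gives the "selection direction" (dy, dx) of plain max pooling.
    idx = cols.argmax(dim=2)                                   # (N, C, Hout, Wout)
    dy = torch.div(idx, kernel, rounding_mode='floor').float()
    dx = (idx % kernel).float()

    # Regularization: smooth the selection directions over neighbouring windows
    # with a simple average filter (the paper's smoothing operator may differ).
    pad = smooth // 2
    dy = F.avg_pool2d(F.pad(dy, (pad, pad, pad, pad), mode='replicate'), smooth, stride=1)
    dx = F.avg_pool2d(F.pad(dx, (pad, pad, pad, pad), mode='replicate'), smooth, stride=1)

    # Round the smoothed directions back to valid in-window offsets and gather the values.
    dy = dy.round().clamp(0, kernel - 1).long()
    dx = dx.round().clamp(0, kernel - 1).long()
    smoothed_idx = dy * kernel + dx                            # (N, C, Hout, Wout)
    return cols.gather(2, smoothed_idx.unsqueeze(2)).squeeze(2)

# Example: compare output shape with plain max pooling on a random feature map.
x = torch.randn(1, 8, 32, 32)
print(regularized_pool2d(x).shape, F.max_pool2d(x, 2).shape)
```

In this sketch, setting `smooth=1` disables the smoothing and recovers plain max pooling, which mirrors the paper's framing of max pooling as the unregularized extreme.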
Related papers
- MorphPool: Efficient Non-linear Pooling & Unpooling in CNNs [9.656707333320037]
Pooling is essentially an operation from the field of Mathematical Morphology, with max pooling as a limited special case.
In addition to pooling operations, encoder-decoder networks used for pixel-level predictions also require unpooling.
Extensive experimentation on two tasks and three large-scale datasets shows that morphological pooling and unpooling lead to improved predictive performance at much reduced parameter counts.
arXiv Detail & Related papers (2022-11-25T11:25:20Z)
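The MorphPool entry above views pooling through mathematical morphology. The hedged sketch below shows grey-scale dilation with a learnable per-channel structuring element, of which flat max pooling is the all-zeros special case; the class name and parameterization are illustrative assumptions, not the paper's implementation.

```python
# Pooling as morphological (dilation-style) downsampling with a learnable structuring element.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MorphDilationPool2d(nn.Module):
    def __init__(self, channels: int, kernel: int = 2):
        super().__init__()
        self.kernel = kernel
        # Learnable structuring element: one k*k window per channel.
        # Initialized to zeros, so the module starts as ordinary max pooling.
        self.se = nn.Parameter(torch.zeros(channels, kernel * kernel))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        hout, wout = h // self.kernel, w // self.kernel
        cols = F.unfold(x, self.kernel, stride=self.kernel)            # (N, C*k*k, L)
        cols = cols.view(n, c, self.kernel * self.kernel, hout * wout)
        # Grey-scale dilation: max over the window of (value + structuring element).
        out = (cols + self.se.view(1, c, -1, 1)).amax(dim=2)
        return out.view(n, c, hout, wout)

pool = MorphDilationPool2d(channels=8)
print(pool(torch.randn(1, 8, 32, 32)).shape)   # torch.Size([1, 8, 16, 16])
```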
- Hierarchical Spherical CNNs with Lifting-based Adaptive Wavelets for Pooling and Unpooling [101.72318949104627]
We propose a novel framework of hierarchical convolutional neural networks (HS-CNNs) with a lifting structure to learn adaptive spherical wavelets for pooling and unpooling.
LiftHS-CNN ensures more efficient hierarchical feature learning for both image- and pixel-level tasks.
arXiv Detail & Related papers (2022-05-31T07:23:42Z)
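To make the lifting structure mentioned in the entry above concrete, here is a generic one-dimensional lifting step (split / predict / update) with learnable operators. The actual LiftHS-CNN operates on spherical signals with adaptive wavelets, so this is only a hedged illustration of the underlying scheme; the layer sizes and operator choices are my own.

```python
# A generic 1-D lifting step: split into even/odd samples, predict, update.
import torch
import torch.nn as nn

class LiftingStep(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Learnable predict and update operators (tiny convolutions here).
        self.predict = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.update = nn.Conv1d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor):
        even, odd = x[..., 0::2], x[..., 1::2]   # the "lazy wavelet" split
        detail = odd - self.predict(even)        # high-pass: what predict misses
        approx = even + self.update(detail)      # low-pass: acts like pooling
        return approx, detail

    def inverse(self, approx: torch.Tensor, detail: torch.Tensor) -> torch.Tensor:
        # Lifting is invertible by construction, so unpooling is exact.
        even = approx - self.update(detail)
        odd = detail + self.predict(even)
        return torch.stack((even, odd), dim=-1).reshape(*even.shape[:-1], -1)

step = LiftingStep(channels=4)
sig = torch.randn(1, 4, 64)
a, d = step(sig)
print(a.shape, torch.allclose(step.inverse(a, d), sig, atol=1e-5))
```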
- Pooling Revisited: Your Receptive Field is Suboptimal [35.11562214480459]
The size and shape of the receptive field determine how the network aggregates local information.
We propose a simple yet effective Dynamically Optimized Pooling operation, referred to as DynOPool.
Our experiments show that the models equipped with the proposed learnable resizing module outperform the baseline networks on multiple datasets in image classification and semantic segmentation.
arXiv Detail & Related papers (2022-05-30T17:03:40Z)
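As a rough, hedged illustration of a learnable resizing module in the spirit of the entry above (not DynOPool's actual formulation), the sketch below resamples a feature map to a resolution set by a trainable ratio. Note that naively rounding the output size blocks the gradient to the ratio; handling that optimization properly is precisely what the paper's method is about and is omitted here.

```python
# Illustrative learnable-resizing pooling: output size controlled by a trainable ratio.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableResizePool(nn.Module):
    def __init__(self, init_ratio: float = 0.5):
        super().__init__()
        # Trainable downsampling ratio in (0, 1), parameterized through a sigmoid.
        self.logit = nn.Parameter(torch.logit(torch.tensor(init_ratio)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        ratio = torch.sigmoid(self.logit)
        # Rounding to integer sizes detaches the ratio from the gradient;
        # DynOPool's dedicated treatment of this step is not reproduced here.
        h = max(1, int(round(x.shape[-2] * ratio.item())))
        w = max(1, int(round(x.shape[-1] * ratio.item())))
        return F.interpolate(x, size=(h, w), mode='bilinear', align_corners=False)

pool = LearnableResizePool(init_ratio=0.5)
print(pool(torch.randn(1, 8, 32, 32)).shape)   # torch.Size([1, 8, 16, 16])
```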
- AdaPool: Exponential Adaptive Pooling for Information-Retaining Downsampling [82.08631594071656]
Pooling layers are essential building blocks of Convolutional Neural Networks (CNNs).
We propose an adaptive and exponentially weighted pooling method named adaPool.
We demonstrate how adaPool improves the preservation of detail through a range of tasks including image and video classification and object detection.
arXiv Detail & Related papers (2021-11-01T08:50:37Z)
- Distribution Mismatch Correction for Improved Robustness in Deep Neural Networks [86.42889611784855]
Normalization methods increase the vulnerability of deep neural networks with respect to noise and input corruptions.
We propose an unsupervised non-parametric distribution correction method that adapts the activation distribution of each layer.
In our experiments, we empirically show that the proposed method effectively reduces the impact of intense image corruptions.
arXiv Detail & Related papers (2021-10-05T11:36:25Z)
- Ordinal Pooling [26.873004843826962]
Ordinal pooling rearranges elements of a pooling region in a sequence and assigns a different weight to each element based upon its order in the sequence.
Experiments suggest that it is advantageous for the networks to perform different types of pooling operations within a pooling layer.
arXiv Detail & Related papers (2021-09-03T14:33:02Z)
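The description of ordinal pooling above translates fairly directly into code: sort the values of each window, then take a weighted sum with one learnable weight per rank. The softmax normalization of the rank weights is my own choice for this sketch rather than a detail from the paper.

```python
# Sketch of ordinal pooling: rank-based weighted combination within each window.
import torch
import torch.nn as nn
import torch.nn.functional as F

class OrdinalPool2d(nn.Module):
    def __init__(self, kernel: int = 2):
        super().__init__()
        self.kernel = kernel
        # One learnable weight per rank position inside the window.
        self.rank_weights = nn.Parameter(torch.ones(kernel * kernel))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        hout, wout = h // self.kernel, w // self.kernel
        cols = F.unfold(x, self.kernel, stride=self.kernel)            # (N, C*k*k, L)
        cols = cols.view(n, c, self.kernel * self.kernel, hout * wout)
        # Rearrange each window's values into descending order.
        ordered, _ = cols.sort(dim=2, descending=True)
        # Weighted sum over ranks; softmax keeps the weights a convex combination.
        weights = torch.softmax(self.rank_weights, dim=0).view(1, 1, -1, 1)
        out = (ordered * weights).sum(dim=2)
        return out.view(n, c, hout, wout)

pool = OrdinalPool2d(kernel=2)
print(pool(torch.randn(1, 8, 32, 32)).shape)   # torch.Size([1, 8, 16, 16])
```

With all weight on the first rank this reduces to max pooling, and with uniform weights it reduces to average pooling, which is what allows a single layer to mix different pooling behaviours.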
- DeepSplit: Scalable Verification of Deep Neural Networks via Operator Splitting [70.62923754433461]
Analyzing the worst-case performance of deep neural networks against input perturbations amounts to solving a large-scale non-convex optimization problem.
We propose a novel method that can directly solve a convex relaxation of the problem to high accuracy, by splitting it into smaller subproblems that often have analytical solutions.
arXiv Detail & Related papers (2021-06-16T20:43:49Z)
- Refining activation downsampling with SoftPool [74.1840492087968]
Convolutional Neural Networks (CNNs) use pooling to decrease the size of activation maps.
We propose SoftPool: a fast and efficient method for exponentially weighted activation downsampling.
We show that SoftPool can retain more information in the reduced activation maps.
arXiv Detail & Related papers (2021-01-02T12:09:49Z)
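A minimal sketch of exponentially weighted downsampling in the spirit of SoftPool, assuming non-overlapping windows: each activation is weighted by the softmax of the activations in its window, so the result sits between max and average pooling. This is a plain re-implementation of the idea, not the authors' optimized code.

```python
# Softmax-weighted (exponential) pooling over non-overlapping windows.
import torch
import torch.nn.functional as F

def soft_pool2d(x: torch.Tensor, kernel: int = 2) -> torch.Tensor:
    n, c, h, w = x.shape
    hout, wout = h // kernel, w // kernel
    cols = F.unfold(x, kernel, stride=kernel).view(n, c, kernel * kernel, hout, wout)
    weights = torch.softmax(cols, dim=2)          # exponential weights within each window
    return (weights * cols).sum(dim=2)            # (N, C, Hout, Wout)

x = torch.randn(1, 8, 32, 32)
print(soft_pool2d(x).shape, F.max_pool2d(x, 2).shape, F.avg_pool2d(x, 2).shape)
```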
- Receptive Field Size Optimization with Continuous Time Pooling [0.0]
We present an altered version of the most commonly applied method, max pooling, in which the pooling operation is, in theory, substituted by a continuous-time differential equation.
We evaluate the effect of continuous pooling on accuracy and computational cost using commonly applied network architectures and datasets.
arXiv Detail & Related papers (2020-11-02T10:21:51Z)
- A Flexible Framework for Designing Trainable Priors with Adaptive Smoothing and Game Encoding [57.1077544780653]
We introduce a general framework for designing and training neural network layers whose forward passes can be interpreted as solving non-smooth convex optimization problems.
We focus on convex games, solved by local agents represented by the nodes of a graph and interacting through regularization functions.
This approach is appealing for solving imaging problems, as it allows the use of classical image priors within deep models that are trainable end to end.
arXiv Detail & Related papers (2020-06-26T08:34:54Z)
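The entry above concerns layers whose forward pass solves a non-smooth convex problem. The classic, much simpler instance of that pattern is an unrolled ISTA layer for sparse coding, sketched below with my own choices of dimensions and step size; it only illustrates the "forward pass = optimization" idea, not the paper's convex-game framework.

```python
# A layer whose forward pass runs ISTA iterations, i.e. approximately solves
#   min_z 0.5 * ||x - D z||^2 + lam * ||z||_1   (non-smooth and convex in z).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ISTALayer(nn.Module):
    def __init__(self, in_dim: int, code_dim: int, steps: int = 10, lam: float = 0.1):
        super().__init__()
        self.D = nn.Parameter(torch.randn(in_dim, code_dim) * 0.1)   # trainable dictionary
        self.steps, self.lam = steps, lam

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Step size from the Lipschitz constant of the smooth quadratic term.
        step = 1.0 / (torch.linalg.matrix_norm(self.D, ord=2) ** 2 + 1e-6)
        z = torch.zeros(x.shape[0], self.D.shape[1])
        for _ in range(self.steps):
            grad = (z @ self.D.t() - x) @ self.D                     # gradient of the quadratic term
            # Soft-thresholding is the proximal operator of the L1 penalty.
            z = F.softshrink(z - step * grad, lambd=float(step * self.lam))
        return z                                                     # sparse code, differentiable w.r.t. D

layer = ISTALayer(in_dim=64, code_dim=128)
print(layer(torch.randn(4, 64)).shape)                               # torch.Size([4, 128])
```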
- Multi Layer Neural Networks as Replacement for Pooling Operations [13.481518628796692]
We show that one perceptron can already be used effectively as a pooling operation without increasing the complexity of the model.
We compare our approach to tensor convolution with strides as a pooling operation and show that our approach is both effective and reduces complexity.
arXiv Detail & Related papers (2020-06-12T07:08:38Z)
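As a hedged sketch of the idea in the entry above, the module below replaces pooling with a single perceptron: every k x k window is flattened and mapped to one output by a shared fully connected unit. The sigmoid activation and the weight sharing across channels are my assumptions, not necessarily the configuration used in the paper.

```python
# A single shared perceptron applied to each pooling window as a pooling substitute.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PerceptronPool2d(nn.Module):
    def __init__(self, kernel: int = 2):
        super().__init__()
        self.kernel = kernel
        # One perceptron mapping the k*k window values to a single output.
        self.fc = nn.Linear(kernel * kernel, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        hout, wout = h // self.kernel, w // self.kernel
        cols = F.unfold(x, self.kernel, stride=self.kernel)              # (N, C*k*k, L)
        cols = cols.view(n, c, self.kernel * self.kernel, hout * wout)
        cols = cols.permute(0, 1, 3, 2)                                  # (N, C, L, k*k)
        out = torch.sigmoid(self.fc(cols)).squeeze(-1)                   # (N, C, L)
        return out.reshape(n, c, hout, wout)

pool = PerceptronPool2d(kernel=2)
print(pool(torch.randn(1, 8, 32, 32)).shape)   # torch.Size([1, 8, 16, 16])
```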
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.