Ordinal Pooling
- URL: http://arxiv.org/abs/2109.01561v1
- Date: Fri, 3 Sep 2021 14:33:02 GMT
- Title: Ordinal Pooling
- Authors: Adrien Deliège, Maxime Istasse, Ashwani Kumar, Christophe De Vleeschouwer, Marc Van Droogenbroeck
- Abstract summary: Ordinal pooling rearranges elements of a pooling region in a sequence and assigns a different weight to each element based upon its order in the sequence.
Experiments suggest that it is advantageous for the networks to perform different types of pooling operations within a pooling layer.
- Score: 26.873004843826962
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the framework of convolutional neural networks, downsampling is often
performed with an average-pooling, where all the activations are treated
equally, or with a max-pooling operation that only retains an element with
maximum activation while discarding the others. Both of these operations are
restrictive and have previously been shown to be sub-optimal. To address this
issue, a novel pooling scheme, named ordinal pooling, is introduced in
this work. Ordinal pooling rearranges all the elements of a pooling region in a
sequence and assigns a different weight to each element based upon its order in
the sequence. These weights are used to compute the pooling operation as a
weighted sum of the rearranged elements of the pooling region. They are learned
via standard gradient-based training, allowing the network to learn a behavior anywhere
in the spectrum from average-pooling to max-pooling in a differentiable manner.
Our experiments suggest that it is advantageous for the networks to perform
different types of pooling operations within a pooling layer and that a hybrid
behavior between average- and max-pooling is often beneficial. More
importantly, they also demonstrate that ordinal pooling leads to consistent
improvements in the accuracy over average- or max-pooling operations while
speeding up the training and alleviating the issue of the choice of the pooling
operations and activation functions to be used in the networks. In particular,
ordinal pooling mainly helps on lightweight or quantized deep learning
architectures, such as those typically used in embedded applications.
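To make the mechanism described above concrete, here is a minimal PyTorch sketch of an ordinal pooling layer. It is an illustration under simplifying assumptions (a single rank-weight vector shared across all channels and spatial positions), not the authors' implementation.

```python
# Minimal PyTorch sketch of ordinal pooling (illustrative, not the authors' code):
# each pooling region is sorted and a learned weight is assigned to each rank;
# the output is the weighted sum of the sorted values.
import torch
import torch.nn as nn
import torch.nn.functional as F

class OrdinalPool2d(nn.Module):
    def __init__(self, kernel_size=2, stride=2):
        super().__init__()
        self.kernel_size = kernel_size
        self.stride = stride
        k = kernel_size * kernel_size
        # Initialize at average-pooling; gradient-based training can move the
        # weights anywhere between average- and max-pooling behaviors.
        self.weights = nn.Parameter(torch.full((k,), 1.0 / k))

    def forward(self, x):
        n, c, h, w = x.shape
        k = self.kernel_size * self.kernel_size
        # Extract every pooling region: (N, C*k, L), where L is the number of regions.
        patches = F.unfold(x, self.kernel_size, stride=self.stride)
        patches = patches.view(n, c, k, -1)
        # Rearrange the elements of each region in descending order.
        ordered, _ = patches.sort(dim=2, descending=True)
        # Weighted sum over the ranks with the learned ordinal weights.
        out = (ordered * self.weights.view(1, 1, k, 1)).sum(dim=2)
        out_h = (h - self.kernel_size) // self.stride + 1
        out_w = (w - self.kernel_size) // self.stride + 1
        return out.view(n, c, out_h, out_w)

x = torch.randn(2, 8, 32, 32)
print(OrdinalPool2d()(x).shape)  # torch.Size([2, 8, 16, 16])
```

Initializing the rank weights uniformly makes the layer start out as average-pooling; training is then free to push the weight mass toward the first rank, recovering max-pooling, or to settle on any hybrid in between.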
Related papers
- MorphPool: Efficient Non-linear Pooling & Unpooling in CNNs [9.656707333320037]
Pooling is essentially an operation from the field of Mathematical Morphology, with max pooling as a limited special case.
In addition to pooling operations, encoder-decoder networks used for pixel-level predictions also require unpooling.
Extensive experimentation on two tasks and three large-scale datasets shows that morphological pooling and unpooling lead to improved predictive performance at much reduced parameter counts (a rough dilation-pooling sketch appears after this list).
arXiv Detail & Related papers (2022-11-25T11:25:20Z) - Hierarchical Spherical CNNs with Lifting-based Adaptive Wavelets for Pooling and Unpooling [101.72318949104627]
We propose a novel framework of hierarchical convolutional neural networks (HS-CNNs) with a lifting structure to learn adaptive spherical wavelets for pooling and unpooling.
LiftHS-CNN ensures a more efficient hierarchical feature learning for both image- and pixel-level tasks.
arXiv Detail & Related papers (2022-05-31T07:23:42Z) - AdaPool: Exponential Adaptive Pooling for Information-Retaining Downsampling [82.08631594071656]
Pooling layers are essential building blocks of Convolutional Neural Networks (CNNs).
We propose an adaptive and exponentially weighted pooling method named adaPool.
We demonstrate how adaPool improves the preservation of detail through a range of tasks including image and video classification and object detection.
arXiv Detail & Related papers (2021-11-01T08:50:37Z) - Compressing Deep ODE-Nets using Basis Function Expansions [105.05435207079759]
We consider formulations of the weights as continuous-depth functions using linear combinations of basis functions.
This perspective allows us to compress the weights through a change of basis, without retraining, while maintaining near state-of-the-art performance.
In turn, both inference time and the memory footprint are reduced, enabling quick and rigorous adaptation between computational environments.
arXiv Detail & Related papers (2021-06-21T03:04:51Z) - Refining activation downsampling with SoftPool [74.1840492087968]
Convolutional Neural Networks (CNNs) use pooling to decrease the size of activation maps.
We propose SoftPool: a fast and efficient method for exponentially weighted activation downsampling.
We show that SoftPool can retain more information in the reduced activation maps (a rough sketch of this exponential weighting appears after this list).
arXiv Detail & Related papers (2021-01-02T12:09:49Z) - Self Normalizing Flows [65.73510214694987]
We propose a flexible framework for training normalizing flows by replacing expensive terms in the gradient by learned approximate inverses at each layer.
This reduces the computational complexity of each layer's exact update from $\mathcal{O}(D^3)$ to $\mathcal{O}(D^2)$.
We show experimentally that such models are remarkably stable and optimize to similar data likelihood values as their exact gradient counterparts.
arXiv Detail & Related papers (2020-11-14T09:51:51Z) - Receptive Field Size Optimization with Continuous Time Pooling [0.0]
We will present an altered version of the most commonly applied method, maximum pooling, where pooling in theory is substituted by a continuous time differential equation.
We will evaluate the effect of continuous pooling on accuracy and computational need using commonly applied network architectures and datasets.
arXiv Detail & Related papers (2020-11-02T10:21:51Z) - Multi Layer Neural Networks as Replacement for Pooling Operations [13.481518628796692]
We show that one perceptron can already be used effectively as a pooling operation without increasing the complexity of the model.
We compare our approach to tensor convolution with strides as a pooling operation and show that our approach is both effective and reduces complexity (a rough perceptron-pooling sketch appears after this list).
arXiv Detail & Related papers (2020-06-12T07:08:38Z) - Regularized Pooling [12.387676601792899]
In convolutional neural networks (CNNs), pooling operations play important roles such as dimensionality reduction and deformation compensation.
We propose regularized pooling, which enables the value selection direction in the pooling operation to be spatially smooth across adjacent kernels.
arXiv Detail & Related papers (2020-05-06T09:02:17Z) - Strip Pooling: Rethinking Spatial Pooling for Scene Parsing [161.7521770950933]
We introduce strip pooling, which considers a long but narrow kernel, i.e., 1xN or Nx1.
We compare the performance of the proposed strip pooling and conventional spatial pooling techniques.
Both novel pooling-based designs are lightweight and can serve as an efficient plug-and-play module in existing scene parsing networks (a rough strip-pooling sketch appears after this list).
arXiv Detail & Related papers (2020-03-30T10:40:11Z)
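For the MorphPool entry above, the following is a hedged sketch of pooling as a morphological dilation with a learned structuring element; the class and parameter names are illustrative assumptions, not the paper's API. With the structuring element fixed at zero, the operation reduces to plain max-pooling, matching the remark that max pooling is a limited special case.

```python
# Hedged sketch: pooling as morphological dilation with a learned,
# per-channel structuring element (illustrative, not the MorphPool code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilationPool2d(nn.Module):
    def __init__(self, channels, kernel_size=2, stride=2):
        super().__init__()
        self.kernel_size = kernel_size
        self.stride = stride
        # A flat (all-zero) structuring element recovers ordinary max pooling.
        self.se = nn.Parameter(torch.zeros(channels, kernel_size * kernel_size))

    def forward(self, x):
        n, c, h, w = x.shape
        k = self.kernel_size * self.kernel_size
        patches = F.unfold(x, self.kernel_size, stride=self.stride)  # (N, C*k, L)
        patches = patches.view(n, c, k, -1)
        # Dilation: add the structuring element, then take the maximum.
        out = (patches + self.se.view(1, c, k, 1)).amax(dim=2)
        out_h = (h - self.kernel_size) // self.stride + 1
        out_w = (w - self.kernel_size) // self.stride + 1
        return out.view(n, c, out_h, out_w)
```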
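For the SoftPool entry, a rough sketch of exponentially weighted activation downsampling: every activation in a region contributes in proportion to its softmax weight, so large activations dominate (as in max-pooling) without the others being discarded. This is a simplified reading of the summary, not the official implementation.

```python
# Rough sketch of softmax-weighted (exponential) downsampling.
import torch
import torch.nn.functional as F

def soft_pool2d(x, kernel_size=2, stride=2):
    n, c, h, w = x.shape
    k = kernel_size * kernel_size
    patches = F.unfold(x, kernel_size, stride=stride).view(n, c, k, -1)
    weights = torch.softmax(patches, dim=2)   # exponential weight per element
    out = (weights * patches).sum(dim=2)      # weighted sum over each region
    out_h = (h - kernel_size) // stride + 1
    out_w = (w - kernel_size) // stride + 1
    return out.view(n, c, out_h, out_w)
```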
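For the entry on multi layer neural networks as a replacement for pooling operations, a hedged sketch in which a single perceptron maps the values of each pooling region to one output; sharing one perceptron across all channels is an assumption made here to keep the parameter count negligible.

```python
# Hedged sketch: one shared perceptron applied to every pooling region.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PerceptronPool2d(nn.Module):
    def __init__(self, kernel_size=2, stride=2):
        super().__init__()
        self.kernel_size = kernel_size
        self.stride = stride
        # A single perceptron over the k values of a pooling region.
        self.fc = nn.Linear(kernel_size * kernel_size, 1)

    def forward(self, x):
        n, c, h, w = x.shape
        k = self.kernel_size * self.kernel_size
        patches = F.unfold(x, self.kernel_size, stride=self.stride)  # (N, C*k, L)
        patches = patches.view(n, c, k, -1).permute(0, 1, 3, 2)      # (N, C, L, k)
        out = torch.tanh(self.fc(patches)).squeeze(-1)               # (N, C, L)
        out_h = (h - self.kernel_size) // self.stride + 1
        out_w = (w - self.kernel_size) // self.stride + 1
        return out.reshape(n, c, out_h, out_w)
```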
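For the Strip Pooling entry, a rough sketch of pooling with 1xN and Nx1 kernels whose strip statistics are broadcast back to full resolution and fused into a gate; the 1x1 convolutions and the sigmoid fusion are illustrative assumptions rather than the paper's exact module.

```python
# Rough sketch of strip pooling: average over 1xW and Hx1 strips, then gate.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StripPool(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv_h = nn.Conv2d(channels, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x):
        n, c, h, w = x.shape
        strip_h = F.adaptive_avg_pool2d(x, (h, 1))  # average each 1 x W row strip  -> (N, C, H, 1)
        strip_w = F.adaptive_avg_pool2d(x, (1, w))  # average each H x 1 column strip -> (N, C, 1, W)
        # Broadcast the strip statistics back to H x W and fuse them into a gate.
        strip_h = self.conv_h(strip_h).expand(-1, -1, h, w)
        strip_w = self.conv_w(strip_w).expand(-1, -1, h, w)
        return x * torch.sigmoid(strip_h + strip_w)
```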
This list is automatically generated from the titles and abstracts of the papers listed on this site.
The site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.