Rotation-Scale Equivariant Steerable Filters
- URL: http://arxiv.org/abs/2304.04600v1
- Date: Mon, 10 Apr 2023 14:13:56 GMT
- Title: Rotation-Scale Equivariant Steerable Filters
- Authors: Yilong Yang, Srinandan Dasmahapatra, Sasan Mahmoodi
- Abstract summary: Digital histology imaging of biopsy tissue can be captured at arbitrary orientation and magnification and stored at different resolutions.
We propose the Rotation-Scale Equivariant Steerable Filter (RSESF), which incorporates steerable filters and scale-space theory.
Our method outperforms other approaches, with far fewer trainable parameters and lower GPU resource requirements.
- Score: 1.213915839836187
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Incorporating either rotation equivariance or scale equivariance into CNNs
has proved to be effective in improving models' generalization performance.
However, jointly integrating rotation and scale equivariance into CNNs has not
been widely explored. Digital histology imaging of biopsy tissue can be
captured at arbitrary orientation and magnification and stored at different
resolutions, resulting in cells appearing in different scales. When
conventional CNNs are applied to histopathology image analysis, the
generalization performance of models is limited because 1) some of the filter
parameters are trained to fit rotation transformations, reducing the capacity
to learn other discriminative features; 2)
fixed-size filters trained on images at a given scale fail to generalize to
those at different scales. To deal with these issues, we propose the
Rotation-Scale Equivariant Steerable Filter (RSESF), which incorporates
steerable filters and scale-space theory. The RSESF contains copies of filters
that are linear combinations of Gaussian filters, whose direction is controlled
by directional derivatives and whose scale parameters are trainable but
constrained to span disjoint scales in successive layers of the network.
Extensive experiments on two gland segmentation datasets demonstrate that our
method outperforms other approaches, with far fewer trainable parameters and
lower GPU resource requirements. The source code is available at:
https://github.com/ynulonger/RSESF.
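As a concrete illustration of the construction the abstract describes, here is a minimal NumPy sketch of a steerable filter bank: oriented first-order Gaussian derivative filters obtained as linear combinations of a fixed basis, replicated over several scales. The filter size, scale values, and orientation count are illustrative assumptions, not the authors' implementation.
```python
import numpy as np

def gaussian_derivative_basis(size, sigma):
    """First-order Gaussian derivative filters G_x, G_y at scale sigma."""
    r = np.arange(size) - size // 2
    xx, yy = np.meshgrid(r, r)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    g /= g.sum()
    gx = -xx / sigma ** 2 * g  # derivative of the Gaussian along x
    gy = -yy / sigma ** 2 * g  # derivative of the Gaussian along y
    return gx, gy

def steer(gx, gy, theta):
    """Steerability: an oriented copy is a linear combination of the basis."""
    return np.cos(theta) * gx + np.sin(theta) * gy

# Rotation-scale filter bank: rotated copies of the basis at several scales.
orientations = np.linspace(0, np.pi, 4, endpoint=False)
scales = [1.0, 2.0, 4.0]  # illustrative; RSESF constrains scales per layer
bank = [steer(*gaussian_derivative_basis(9, s), t)
        for s in scales for t in orientations]
print(len(bank), bank[0].shape)  # 12 filters, each 9x9
```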
Related papers
- As large as it gets: Learning infinitely large Filters via Neural Implicit Functions in the Fourier Domain [22.512062422338914]
Recent work in neural networks for image classification has seen a strong tendency towards increasing the spatial context.
We propose a module for studying the effective filter size of convolutional neural networks.
Our analysis shows that, although the proposed networks could learn very large convolution kernels, the learned filters are well localized and relatively small in practice.
arXiv Detail & Related papers (2023-07-19T14:21:11Z)
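A hedged PyTorch sketch of the mechanism this summary suggests: a small MLP maps frequency coordinates to a filter's Fourier coefficients, so the effective spatial filter can be as large as the input itself. The module name, architecture, and single-channel restriction are hypothetical simplifications.
```python
import torch
import torch.nn as nn

class FourierImplicitFilter(nn.Module):
    """Hypothetical sketch: an MLP predicts a filter's Fourier spectrum
    from frequency coordinates, giving an effectively unbounded kernel."""
    def __init__(self, hidden=32):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # real and imaginary parts
        )

    def forward(self, x):  # x: (B, H, W), single channel for brevity
        B, H, W = x.shape
        u = torch.fft.fftfreq(H).view(H, 1).expand(H, W)
        v = torch.fft.fftfreq(W).view(1, W).expand(H, W)
        coef = self.mlp(torch.stack([u, v], dim=-1))   # (H, W, 2)
        filt = torch.complex(coef[..., 0], coef[..., 1])
        # Pointwise product in the Fourier domain = image-sized convolution.
        return torch.fft.ifft2(torch.fft.fft2(x) * filt).real

x = torch.randn(1, 64, 64)
print(FourierImplicitFilter()(x).shape)  # torch.Size([1, 64, 64])
```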
- Sorted Convolutional Network for Achieving Continuous Rotational Invariance [56.42518353373004]
We propose a Sorting Convolution (SC) inspired by some hand-crafted features of texture images.
SC achieves continuous rotational invariance without requiring additional learnable parameters or data augmentation.
Our results demonstrate that SC achieves the best performance in the aforementioned tasks.
arXiv Detail & Related papers (2023-05-23T18:37:07Z)
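The summary does not spell out the sorting scheme, so the following NumPy sketch only illustrates the general principle behind sorting-based rotation invariance: values sampled on concentric rings are sorted, and a rotation merely permutes each ring, leaving the sorted result unchanged. This is an assumption-laden stand-in, not the paper's SC operator.
```python
import numpy as np

def ring_sorted_descriptor(patch):
    """Sort values on concentric rings around the patch centre; rotations
    only permute values within a ring, so the output is unchanged."""
    c = patch.shape[0] // 2
    yy, xx = np.indices(patch.shape)
    radius = np.round(np.hypot(yy - c, xx - c)).astype(int)
    rings = [np.sort(patch[radius == r]) for r in range(c + 1)]
    return np.concatenate(rings)

patch = np.random.rand(7, 7)
rotated = np.rot90(patch)  # a 90-degree rotation permutes each ring
print(np.allclose(ring_sorted_descriptor(patch),
                  ring_sorted_descriptor(rotated)))  # True
```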
- Scale-Equivariant UNet for Histopathology Image Segmentation [1.213915839836187]
Convolutional Neural Networks (CNNs) trained on histopathology images at a given scale fail to generalise to images at different scales.
We propose the Scale-Equivariant UNet (SEUNet) for image segmentation by building on scale-space theory.
arXiv Detail & Related papers (2023-04-10T14:03:08Z)
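A minimal sketch of the scale-space lifting SEUNet builds on, under two stated assumptions: the input is smoothed at a geometric range of Gaussian scales, and responses are max-pooled over the scale axis. Pooling gives scale robustness rather than strict equivariance, so this is a simplification of the paper's construction.
```python
import numpy as np
from scipy.ndimage import gaussian_filter, correlate

def scale_lift(image, sigmas=(1.0, 2.0, 4.0)):
    """Blur the image at a geometric range of scales so one shared filter
    can respond to the same structure regardless of its size."""
    return np.stack([gaussian_filter(image, s) for s in sigmas])

def scale_pooled_response(image, kernel):
    responses = np.stack([correlate(ch, kernel) for ch in scale_lift(image)])
    return responses.max(axis=0)  # pool over the scale axis

img = np.random.rand(32, 32)
k = np.ones((3, 3)) / 9.0
print(scale_pooled_response(img, k).shape)  # (32, 32)
```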
- Effective Invertible Arbitrary Image Rescaling [77.46732646918936]
Invertible Neural Networks (INN) are able to increase upscaling accuracy significantly by optimizing the downscaling and upscaling cycle jointly.
In this work, a simple and effective invertible arbitrary rescaling network (IARN) is proposed to achieve arbitrary image rescaling with a single trained model.
It is shown to achieve state-of-the-art (SOTA) performance in bidirectional arbitrary rescaling without compromising the perceptual quality of low-resolution (LR) outputs.
arXiv Detail & Related papers (2022-09-26T22:22:30Z)
- Implicit Equivariance in Convolutional Networks [1.911678487931003]
Implicitly Equivariant Networks (IEN) induce equivariance in the different layers of a standard CNN model.
We show that IEN outperforms the state-of-the-art rotation-equivariant tracking method while providing faster inference.
arXiv Detail & Related papers (2021-11-28T14:44:17Z)
- FILTRA: Rethinking Steerable CNN by Filter Transform [59.412570807426135]
The problem of steerable CNNs has been studied from the perspective of group representation theory.
We show that kernels constructed by filter transform can also be interpreted in group representation theory.
This interpretation helps complete the puzzle of steerable CNN theory and provides a novel and simple approach to implementing steerable convolution operators.
arXiv Detail & Related papers (2021-05-25T03:32:34Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
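A hedged reading of what an adaptive frequency-response layer could look like: each feature channel learns a coefficient controlling how strongly the graph Laplacian term, i.e. the high-frequency part of the signal, is suppressed. The layer below is an illustrative assumption, not AdaGNN's exact formulation.
```python
import torch
import torch.nn as nn

class AdaptiveFreqLayer(nn.Module):
    """Per-channel learnable smoothing: phi near 0 keeps all frequencies,
    phi near 1 strongly attenuates high-frequency graph signal."""
    def __init__(self, num_features):
        super().__init__()
        self.phi = nn.Parameter(torch.full((num_features,), 0.5))

    def forward(self, x, laplacian):  # x: (N, F), laplacian: (N, N)
        return x - (laplacian @ x) * self.phi

n, f = 5, 3
adj = torch.rand(n, n)
adj = ((adj + adj.T) > 1.0).float()  # random symmetric 0/1 adjacency
adj.fill_diagonal_(0)
lap = torch.diag(adj.sum(1)) - adj   # combinatorial graph Laplacian
print(AdaptiveFreqLayer(f)(torch.randn(n, f), lap).shape)  # (5, 3)
```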
- PSConv: Squeezing Feature Pyramid into One Compact Poly-Scale Convolutional Layer [76.44375136492827]
Convolutional Neural Networks (CNNs) are often scale-sensitive.
We address this limitation by exploiting multi-scale features at a finer granularity.
The proposed convolution operation, named Poly-Scale Convolution (PSConv), mixes up a spectrum of dilation rates.
arXiv Detail & Related papers (2020-07-13T05:14:11Z)
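A hedged PyTorch sketch of the poly-scale idea: a single layer whose output channels are split across several dilation rates, so one convolution sees a spectrum of receptive-field sizes. PSConv's actual mixing is finer-grained (at kernel level), so the group-wise split below is only a coarse stand-in.
```python
import torch
import torch.nn as nn

class PolyScaleConv(nn.Module):
    """Split output channels into groups, one dilation rate per group."""
    def __init__(self, in_ch, out_ch, dilations=(1, 2, 3, 4)):
        super().__init__()
        assert out_ch % len(dilations) == 0
        g = out_ch // len(dilations)
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, g, 3, padding=d, dilation=d) for d in dilations
        )

    def forward(self, x):
        return torch.cat([b(x) for b in self.branches], dim=1)

x = torch.randn(1, 8, 32, 32)
print(PolyScaleConv(8, 16)(x).shape)  # torch.Size([1, 16, 32, 32])
```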
- Dense Steerable Filter CNNs for Exploiting Rotational Symmetry in Histology Images [3.053417311299492]
Histology images are inherently symmetric under rotation, where each orientation is equally likely to appear.
Dense Steerable Filter CNNs (DSF-CNNs) use group convolutions with multiple rotated copies of each filter in a densely connected framework.
We show that DSF-CNNs achieve state-of-the-art performance, with significantly fewer parameters, when applied to three different tasks in the area of computational pathology.
arXiv Detail & Related papers (2020-04-06T23:12:31Z)
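A minimal NumPy sketch of the rotated-filter-copies idea behind group convolutions: convolving with several rotated copies of one base filter adds an orientation axis to the features. DSF-CNN rotates steerable filters exactly through basis coefficients, whereas the interpolation below is only an approximation.
```python
import numpy as np
from scipy.ndimage import rotate, correlate

def rotated_copies(kernel, n_orientations=8):
    """Rotated copies of one base filter (bilinear interpolation)."""
    angles = [i * 360.0 / n_orientations for i in range(n_orientations)]
    return [rotate(kernel, a, reshape=False, order=1) for a in angles]

img = np.random.rand(32, 32)
base = np.random.rand(5, 5)
stack = np.stack([correlate(img, k) for k in rotated_copies(base)])
print(stack.shape)  # (8, 32, 32): one response map per orientation
```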
- Embedding Propagation: Smoother Manifold for Few-Shot Classification [131.81692677836202]
We propose to use embedding propagation as an unsupervised non-parametric regularizer for manifold smoothing in few-shot classification.
We empirically show that embedding propagation yields a smoother embedding manifold.
We show that embedding propagation consistently improves the accuracy of the models in multiple semi-supervised learning scenarios by up to 16 percentage points.
arXiv Detail & Related papers (2020-03-09T13:51:09Z)
- DCT-Conv: Coding filters in convolutional networks with Discrete Cosine Transform [0.0]
We analyze how switching off selected components of the spectra, thereby reducing the number of trained weights of the network, affects its performance.
Experiments show that coding the filters with trained DCT parameters leads to improvement over traditional convolution.
arXiv Detail & Related papers (2020-01-23T13:58:17Z)
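A minimal sketch of DCT filter coding as the summary describes it: the trained parameters live in the DCT domain and the spatial kernel is their inverse transform, so masking high-frequency components reduces the number of trained weights per filter. The mask shape and the `keep` parameter are illustrative assumptions.
```python
import numpy as np
from scipy.fft import idctn

def dct_coded_filter(coef, keep=2):
    """Spatial kernel from DCT coefficients; only the low-frequency
    keep x keep block is retained (and would be trained)."""
    mask = np.zeros_like(coef)
    mask[:keep, :keep] = 1.0
    return idctn(coef * mask, norm='ortho')

coef = np.random.randn(5, 5)  # stands in for trainable parameters
kernel = dct_coded_filter(coef, keep=2)
print(kernel.shape)  # (5, 5) kernel driven by only 4 trained coefficients
```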