A Continuous Convolutional Trainable Filter for Modelling Unstructured Data
- URL: http://arxiv.org/abs/2210.13416v3
- Date: Thu, 25 May 2023 09:08:40 GMT
- Title: A Continuous Convolutional Trainable Filter for Modelling Unstructured Data
- Authors: Dario Coscia, Laura Meneghetti, Nicola Demo, Giovanni Stabile, Gianluigi Rozza
- Abstract summary: We propose a continuous version of a trainable convolutional filter that can also operate on unstructured data.
Our experiments show that the continuous filter achieves accuracy comparable to that of the state-of-the-art discrete filter.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Convolutional Neural Network (CNN) is one of the most important
architectures in deep learning. The fundamental building block of a CNN is a
trainable filter, represented as a discrete grid, used to perform convolution
on discrete input data. In this work, we propose a continuous version of a
trainable convolutional filter that can also operate on unstructured data.
This new framework allows CNNs to be explored beyond discrete domains,
extending this important learning technique to many more complex problems.
Our experiments show that the continuous filter can achieve a level of
accuracy comparable to that of the state-of-the-art discrete filter, and that
it can be used as a building block in current deep learning architectures to
solve problems on unstructured domains as well.
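The general idea can be illustrated with a minimal sketch: replace the discrete weight grid with a small neural network that maps a relative coordinate to a filter weight, so the convolution can be evaluated at arbitrary point locations. The PyTorch code below is an illustrative sketch of this scheme, not the authors' implementation; the class name, network size, and compact-support radius are our own assumptions.

```python
# Minimal sketch of a continuous trainable filter: an MLP maps relative
# coordinates to kernel weights, so convolution works on scattered points.
import torch
import torch.nn as nn

class ContinuousFilter(nn.Module):  # hypothetical name, not the paper's API
    def __init__(self, dim=2, hidden=32):
        super().__init__()
        self.weight_net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, query_xy, points_xy, values, radius=0.25):
        # query_xy: (Q, dim) output locations; points_xy: (N, dim); values: (N,)
        rel = points_xy.unsqueeze(0) - query_xy.unsqueeze(1)   # (Q, N, dim)
        w = self.weight_net(rel).squeeze(-1)                   # (Q, N) weights
        mask = (rel.norm(dim=-1) <= radius).float()            # compact support
        return (w * mask * values.unsqueeze(0)).sum(dim=-1)    # (Q,)

# Usage on unstructured data: N scattered input points, Q query points.
pts = torch.rand(100, 2)
vals = torch.sin(4 * pts[:, 0])               # toy scalar field sampled at pts
queries = torch.rand(10, 2)
out = ContinuousFilter()(queries, pts, vals)  # (10,) convolved values
```

Because the kernel is a function of continuous coordinates rather than a fixed grid, the same trained filter can be queried at any resolution or point layout.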
Related papers
- PICNN: A Pathway towards Interpretable Convolutional Neural Networks [12.31424771480963]
We introduce a novel pathway to alleviate the entanglement between filters and image classes.
We use the Bernoulli sampling to generate the filter-cluster assignment matrix from a learnable filter-class correspondence matrix.
We evaluate the effectiveness of our method on ten widely used network architectures.
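As a rough illustration of the sampling step described above, the snippet below draws a binary filter-cluster assignment from a learnable filter-class correspondence matrix via Bernoulli sampling. The straight-through estimator used to keep it differentiable is a common trick and our assumption, not necessarily PICNN's exact procedure.

```python
# Hedged sketch: learnable filter-class correspondence -> Bernoulli sample.
import torch

n_filters, n_classes = 64, 10
logits = torch.nn.Parameter(torch.zeros(n_filters, n_classes))  # learnable

probs = torch.sigmoid(logits)               # correspondence probabilities
hard = torch.bernoulli(probs)               # sampled 0/1 assignment matrix
# Straight-through: forward pass uses the hard sample, gradients flow
# through the underlying probabilities.
assignment = hard + probs - probs.detach()  # (n_filters, n_classes)
```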
arXiv Detail & Related papers (2023-12-19T11:36:03Z)
- As large as it gets: Learning infinitely large Filters via Neural Implicit Functions in the Fourier Domain [22.512062422338914]
Recent work in neural networks for image classification has seen a strong tendency towards increasing the spatial context.
We propose a module for studying the effective filter size of convolutional neural networks.
Our analysis shows that, although the proposed networks could learn very large convolution kernels, the learned filters are well localized and relatively small in practice.
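A hedged sketch of the underlying mechanism: parameterize a filter as a neural implicit function over frequency coordinates and apply it by pointwise multiplication in the Fourier domain, so the effective spatial filter size is unbounded. The two-layer MLP and the coordinate grid below are illustrative assumptions, not the paper's exact design.

```python
# Sketch: a neural implicit function over frequencies defines the filter;
# applying it is elementwise multiplication in the Fourier domain.
import torch
import torch.nn as nn

H = W = 32
mlp = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 2))  # -> (Re, Im)

# Normalized frequency coordinates for an H x W grid.
fy, fx = torch.meshgrid(
    torch.fft.fftfreq(H), torch.fft.fftfreq(W), indexing="ij"
)
coords = torch.stack([fy, fx], dim=-1)              # (H, W, 2)
re_im = mlp(coords)                                 # (H, W, 2)
filt = torch.complex(re_im[..., 0], re_im[..., 1])  # complex filter values

x = torch.randn(H, W)                               # toy single-channel image
y = torch.fft.ifft2(torch.fft.fft2(x) * filt).real  # filtered image, (H, W)
```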
arXiv Detail & Related papers (2023-07-19T14:21:11Z)
- What Can Be Learnt With Wide Convolutional Neural Networks? [69.55323565255631]
We study infinitely-wide deep CNNs in the kernel regime.
We prove that deep CNNs adapt to the spatial scale of the target function.
We conclude by computing the generalisation error of a deep CNN trained on the output of another deep CNN.
arXiv Detail & Related papers (2022-08-01T17:19:32Z)
- Towards a General Purpose CNN for Long Range Dependencies in $\mathrm{N}$D [49.57261544331683]
We propose a single CNN architecture equipped with continuous convolutional kernels for tasks on arbitrary resolution, dimensionality and length without structural changes.
We show the generality of our approach by applying the same CCNN to a wide set of tasks on sequential ($1\mathrm{D}$) and visual data ($2\mathrm{D}$).
Our CCNN performs competitively and often outperforms the current state-of-the-art across all tasks considered.
arXiv Detail & Related papers (2022-06-07T15:48:02Z)
- CNN Filter DB: An Empirical Investigation of Trained Convolutional Filters [2.0305676256390934]
We show that model pre-training can succeed on arbitrary datasets if they meet size and variance conditions.
We show that many pre-trained models contain degenerated filters which make them less robust and less suitable for fine-tuning on target applications.
arXiv Detail & Related papers (2022-03-29T08:25:42Z)
- FILTRA: Rethinking Steerable CNN by Filter Transform [59.412570807426135]
The problem of steerable CNNs has been studied from the perspective of group representation theory.
We show that kernels constructed by filter transform can also be interpreted in group representation theory.
This interpretation helps complete the puzzle of steerable CNN theory and provides a novel, simple approach to implementing steerable convolution operators.
arXiv Detail & Related papers (2021-05-25T03:32:34Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop AdaGNN, a graph neural network framework with a smooth, adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
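One plausible reading of such a filter, sketched below under our own assumptions: give each feature channel a learnable coefficient that controls how strongly the graph Laplacian (high-frequency) component is attenuated. The exact layer definition is inferred from the summary, not taken from the paper.

```python
# Hedged sketch of an adaptive frequency response filter on a graph.
import torch

def adaptive_filter_layer(x, laplacian, phi):
    # x: (n_nodes, n_channels); phi: (n_channels,) learnable per channel.
    # Per-channel frequency response is roughly 1 - phi * lambda.
    return x - (laplacian @ x) * phi

n, c = 6, 3
A = (torch.rand(n, n) > 0.5).float()
A = ((A + A.T) > 0).float()               # toy symmetric adjacency
A.fill_diagonal_(0)
L = torch.diag(A.sum(1)) - A              # combinatorial graph Laplacian
phi = torch.nn.Parameter(0.1 * torch.ones(c))
x = torch.randn(n, c)
out = adaptive_filter_layer(x, L, phi)    # (n, c) filtered node features
```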
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Deep Parametric Continuous Convolutional Neural Networks [92.87547731907176]
Parametric Continuous Convolution is a new learnable operator that operates over non-grid structured data.
Our experiments show significant improvement over the state-of-the-art in point cloud segmentation of indoor and outdoor scenes.
arXiv Detail & Related papers (2021-01-17T18:28:23Z)
- Training Interpretable Convolutional Neural Networks by Differentiating Class-specific Filters [64.46270549587004]
Convolutional neural networks (CNNs) have been successfully used in a range of tasks.
However, CNNs are often viewed as black boxes and lack interpretability.
We propose a novel strategy to train interpretable CNNs by encouraging class-specific filters (see the sketch below).
arXiv Detail & Related papers (2020-07-16T09:12:26Z)
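The sketch below illustrates one way such a constraint can be realized: mask each sample's filter activations with a fixed filter-to-class assignment so that every filter is pushed to respond to a single class. Both the uniform assignment and the masking scheme are illustrative assumptions, not the paper's exact training loss.

```python
# Hedged sketch: gate conv activations by a filter-to-class assignment.
import torch

n_filters, n_classes = 16, 4
# Fixed assignment: filters split evenly across classes (hypothetical choice).
assign = torch.zeros(n_filters, n_classes)
assign[torch.arange(n_filters), torch.arange(n_filters) % n_classes] = 1.0

feats = torch.randn(8, n_filters, 7, 7)      # conv activations for a batch
labels = torch.randint(0, n_classes, (8,))   # ground-truth class per sample
mask = assign[:, labels].T                   # (batch, n_filters)
gated = feats * mask[:, :, None, None]       # suppress other classes' filters
```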