SelectionConv: Convolutional Neural Networks for Non-rectilinear Image Data
- URL: http://arxiv.org/abs/2207.08979v1
- Date: Mon, 18 Jul 2022 23:20:50 GMT
- Title: SelectionConv: Convolutional Neural Networks for Non-rectilinear Image Data
- Authors: David Hart, Michael Whitney, Bryan Morse
- Abstract summary: We introduce a new structured graph convolution operator that can copy 2D convolution weights.
This network can then operate on any data that can be represented as a positional graph.
Results of transferring pre-trained image networks for segmentation, stylization, and depth prediction are demonstrated.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional Neural Networks have revolutionized vision applications. There
are image domains and representations, however, that cannot be handled by
standard CNNs (e.g., spherical images, superpixels). Such data are usually
processed using networks and algorithms specialized for each type. In this
work, we show that it may not always be necessary to use specialized neural
networks to operate on such spaces. Instead, we introduce a new structured
graph convolution operator that can copy 2D convolution weights, transferring
the capabilities of already trained traditional CNNs to our new graph network.
This network can then operate on any data that can be represented as a
positional graph. By converting non-rectilinear data to a graph, we can apply
these convolutions on these irregular image domains without requiring training
on large domain-specific datasets. Results of transferring pre-trained image
networks for segmentation, stylization, and depth prediction are demonstrated
for a variety of such data forms.
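The weight-copying idea is concrete enough to sketch. Below is a minimal, hypothetical PyTorch illustration of the core mechanism: the nine spatial taps of a pretrained 3x3 Conv2d become per-direction linear maps over a positional graph. The class name, the neighbor-table encoding, and the zero-valued pad-node convention are illustrative assumptions, not the authors' published API.

```python
import torch
import torch.nn as nn

class SelectionStyleGraphConv(nn.Module):
    """Reuses the nine spatial taps of a pretrained 3x3 Conv2d as
    per-direction linear maps over a positional graph (hypothetical sketch)."""

    def __init__(self, conv2d: nn.Conv2d):
        super().__init__()
        assert conv2d.kernel_size == (3, 3)
        w = conv2d.weight.data                      # (out_c, in_c, 3, 3)
        # One (out_c, in_c) matrix per kernel tap / edge "selection".
        self.taps = nn.Parameter(
            w.reshape(w.size(0), w.size(1), 9).permute(2, 0, 1).contiguous()
        )
        self.bias = (nn.Parameter(conv2d.bias.data.clone())
                     if conv2d.bias is not None else None)

    def forward(self, x: torch.Tensor, neighbors: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes + 1, in_c), with a final all-zero "pad node" row that
        # emulates zero padding. neighbors: (num_nodes, 9) long tensor where
        # neighbors[v, s] is v's neighbor under selection s (v itself at the
        # center tap s == 4, the pad node where no neighbor exists).
        out = x.new_zeros(neighbors.size(0), self.taps.size(1))
        for s in range(9):
            out = out + x[neighbors[:, s]] @ self.taps[s].t()
        return out + self.bias if self.bias is not None else out
```

In this encoding, any domain that can supply a per-node table of directional neighbors (a spherical image, a superpixel adjacency) can reuse the pretrained weights without retraining, which is the paper's central claim.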
Related papers
- CNN2GNN: How to Bridge CNN with GNN [59.42117676779735]
We propose a novel CNN2GNN framework to unify CNN and GNN together via distillation.
The performance of the distilled "boosted" two-layer GNN on Mini-ImageNet is much higher than that of CNNs containing dozens of layers, such as ResNet152.
arXiv Detail & Related papers (2024-04-23T08:19:08Z)
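A generic soft-target distillation loss illustrates the kind of CNN-to-GNN transfer described in the entry above; this is standard Hinton-style knowledge distillation, not the paper's exact objective, and all names are illustrative.

```python
import torch.nn.functional as F

def distillation_loss(gnn_logits, cnn_logits, labels, T=4.0, alpha=0.7):
    """Soft-target distillation: a (frozen) deep CNN is the teacher and a
    small GNN the student. Generic sketch, not the paper's objective."""
    soft = F.kl_div(
        F.log_softmax(gnn_logits / T, dim=-1),   # student's softened predictions
        F.softmax(cnn_logits / T, dim=-1),       # teacher's softened targets
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(gnn_logits, labels)   # ordinary supervised term
    return alpha * soft + (1 - alpha) * hard
```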
- Training Convolutional Neural Networks with the Forward-Forward algorithm [1.74440662023704]
The Forward-Forward (FF) algorithm has so far only been used in fully connected networks.
We show how the FF paradigm can be extended to CNNs.
Our FF-trained CNN, featuring a novel spatially-extended labeling technique, achieves a classification accuracy of 99.16% on the MNIST hand-written digits dataset.
arXiv Detail & Related papers (2023-12-22T18:56:35Z)
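A minimal sketch of one local Forward-Forward update for a convolutional layer, following the generic FF recipe (a per-layer "goodness" pushed above a threshold on positive data and below it on negative data); the paper's spatially-extended labeling technique is not shown.

```python
import torch
import torch.nn.functional as F

def ff_layer_step(layer, opt, x_pos, x_neg, theta=2.0):
    """One local Forward-Forward update for a conv layer: goodness is the
    mean squared activation; push it above theta on positive data and
    below theta on negative data. Generic FF sketch."""
    g_pos = F.relu(layer(x_pos)).pow(2).mean(dim=(1, 2, 3))
    g_neg = F.relu(layer(x_neg)).pow(2).mean(dim=(1, 2, 3))
    # softplus(theta - g_pos) falls as positive goodness rises above theta;
    # softplus(g_neg - theta) falls as negative goodness drops below it.
    loss = (F.softplus(theta - g_pos) + F.softplus(g_neg - theta)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Detach so the next layer trains on gradient-free inputs.
    return F.relu(layer(x_pos)).detach(), F.relu(layer(x_neg)).detach()
```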
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
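A rough illustration of the stratification idea in the entry above: nodes are bucketed by degree and each bucket gets its own weight matrix. The bucket boundaries and the mean aggregator are assumptions.

```python
import torch
import torch.nn as nn

class DegreeStratifiedLayer(nn.Module):
    """One GNN layer with a separate weight matrix per node-degree group
    (hypothetical bucket boundaries; the paper's grouping may differ)."""

    def __init__(self, in_dim: int, out_dim: int, bounds=(1, 2, 4, 8)):
        super().__init__()
        self.register_buffer("bounds", torch.tensor(bounds))
        self.linears = nn.ModuleList(
            nn.Linear(in_dim, out_dim) for _ in range(len(bounds) + 1)
        )

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (N, in_dim); adj: dense (N, N) adjacency, kept dense for brevity.
        deg = adj.sum(dim=1)
        agg = adj @ h / deg.clamp(min=1).unsqueeze(1)       # mean over neighbors
        group = torch.bucketize(deg, self.bounds.to(deg.dtype))
        out = h.new_zeros(h.size(0), self.linears[0].out_features)
        for g, lin in enumerate(self.linears):              # per-group weights
            sel = group == g
            if sel.any():
                out[sel] = lin(agg[sel])
        return out
```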
- Hyperbolic Convolutional Neural Networks [14.35618845900589]
Using non-Euclidean space for embedding data might result in more robust and explainable models.
We hypothesize that the ability of hyperbolic space to capture hierarchy in the data will lead to better performance.
arXiv Detail & Related papers (2023-08-29T21:20:16Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
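The phrase "kernel functions that compute an inner product on graphs" is easiest to see with the simplest such kernel, a vertex-degree histogram kernel with an explicit feature map; a layer can then describe an input graph purely by its similarities to prototype graphs, with no node embeddings. This is illustrative only; the paper's kernels are more expressive.

```python
import torch

def degree_histogram(adj: torch.Tensor, max_deg: int = 16) -> torch.Tensor:
    """Explicit feature map of the vertex-degree histogram kernel."""
    deg = adj.sum(dim=1).long().clamp(max=max_deg)
    return torch.bincount(deg, minlength=max_deg + 1).float()

def vertex_histogram_kernel(adj1, adj2, max_deg: int = 16) -> torch.Tensor:
    """k(G1, G2) = <phi(G1), phi(G2)>: an inner product on graphs."""
    return degree_histogram(adj1, max_deg) @ degree_histogram(adj2, max_deg)

# A "kernel layer": represent the input graph by its similarity to a bank
# of prototype graphs, with no embedding of the input graph computed.
def kernel_layer(adj, prototypes):
    return torch.stack([vertex_histogram_kernel(adj, p) for p in prototypes])
```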
- Variational models for signal processing with Graph Neural Networks [3.5939555573102853]
This paper is devoted to signal processing on point clouds by means of neural networks.
In this work, we investigate the use of variational models for such Graph Neural Networks to process signals on graphs for unsupervised learning.
arXiv Detail & Related papers (2021-03-30T13:31:11Z)
- Processing of incomplete images by (graph) convolutional neural networks [7.778461949427663]
We investigate the problem of training neural networks from incomplete images without replacing missing values.
We first represent an image as a graph, in which missing pixels are entirely ignored.
The graph image representation is processed using a spatial graph convolutional network.
arXiv Detail & Related papers (2020-10-26T21:40:03Z)
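A sketch of the graph construction described in the entry above: only observed pixels become nodes, edges connect observed 4-neighbors, and aggregation runs over existing neighbors only, so missing values never enter the computation. Details of the paper's spatial graph convolution will differ.

```python
import torch

def image_to_graph(img: torch.Tensor, mask: torch.Tensor):
    """img: (H, W) intensities; mask: (H, W) bool, True where observed.
    Returns node features and an edge list over observed pixels only."""
    H, W = mask.shape
    idx = torch.full((H, W), -1, dtype=torch.long)
    idx[mask] = torch.arange(int(mask.sum()))
    edges = []
    for dy, dx in ((0, 1), (1, 0)):              # 4-connectivity offsets
        both = mask[: H - dy, : W - dx] & mask[dy:, dx:]
        src, dst = idx[: H - dy, : W - dx][both], idx[dy:, dx:][both]
        edges += [torch.stack([src, dst]), torch.stack([dst, src])]
    return img[mask].unsqueeze(1), torch.cat(edges, dim=1)  # (N, 1), (2, E)

def mean_graph_conv(x, edge_index, lin):
    """Average over existing neighbors only, then a learned linear map."""
    src, dst = edge_index
    agg = torch.zeros_like(x).index_add_(0, dst, x[src])
    deg = torch.zeros(x.size(0), 1, device=x.device).index_add_(
        0, dst, torch.ones(src.size(0), 1, device=x.device))
    return lin(agg / deg.clamp(min=1))
```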
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
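Contrastive pre-training of this kind typically encodes two augmented views of each subgraph and applies an InfoNCE loss against in-batch negatives; the sketch below is that generic recipe, not GCC's exact pipeline.

```python
import torch
import torch.nn.functional as F

def info_nce(z_q: torch.Tensor, z_k: torch.Tensor, tau: float = 0.07):
    """z_q, z_k: (B, D) embeddings of two views of the same B subgraphs.
    Row i of z_k is the positive for row i of z_q; other rows are negatives."""
    z_q, z_k = F.normalize(z_q, dim=1), F.normalize(z_k, dim=1)
    logits = z_q @ z_k.t() / tau                  # (B, B) similarity matrix
    targets = torch.arange(z_q.size(0), device=z_q.device)
    return F.cross_entropy(logits, targets)       # diagonal entries are positives
```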
- Test-Time Adaptable Neural Networks for Robust Medical Image Segmentation [9.372152932156293]
Convolutional Neural Networks (CNNs) work very well for supervised learning problems.
In medical image segmentation, this premise is violated when there is a mismatch between training and test images in terms of their acquisition details.
We design the segmentation CNN as a concatenation of two sub-networks: a relatively shallow image normalization CNN, followed by a deep CNN that segments the normalized image.
arXiv Detail & Related papers (2020-04-09T16:57:27Z)
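A schematic of the two-sub-network design described in the entry above; the layer sizes are placeholders and the deep segmenter is a stand-in for a real U-Net. At test time only the shallow normalization network would be adapted.

```python
import torch.nn as nn

class AdaptableSegmenter(nn.Module):
    """Shallow image-normalization CNN followed by a deep segmentation CNN.
    At test time, freeze `segment` and adapt only `normalize`."""

    def __init__(self, num_classes: int, width: int = 16):
        super().__init__()
        self.normalize = nn.Sequential(           # shallow, image-to-image
            nn.Conv2d(1, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, 1, 3, padding=1),
        )
        self.segment = nn.Sequential(             # placeholder for a deep U-Net
            nn.Conv2d(1, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, num_classes, 1),
        )

    def forward(self, x):
        return self.segment(self.normalize(x))

# Test-time adaptation would optimize only the normalization sub-network:
# opt = torch.optim.Adam(model.normalize.parameters(), lr=1e-3)
```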
- Analyzing Neural Networks Based on Random Graphs [77.34726150561087]
We perform a massive evaluation of neural networks with architectures corresponding to random graphs of various types.
We find that no classical numerical graph invariant by itself allows us to single out the best networks.
We also find that networks with primarily short-range connections perform better than networks which allow for many long-range connections.
arXiv Detail & Related papers (2020-02-19T11:04:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.