Survey of Image Based Graph Neural Networks
- URL: http://arxiv.org/abs/2106.06307v1
- Date: Fri, 11 Jun 2021 10:56:43 GMT
- Title: Survey of Image Based Graph Neural Networks
- Authors: Usman Nazir, He Wang and Murtaza Taj
- Abstract summary: We first convert the image into superpixels using the Quickshift algorithm, thereby reducing the input data by 30%.
The superpixels are subsequently used to generate a region adjacency graph.
The graph is passed through a state-of-the-art graph convolutional neural network to obtain classification scores.
- Score: 10.437582458089034
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this survey paper, we analyze image-based graph neural networks and
propose a three-step classification approach. We first convert the image into
superpixels using the Quickshift algorithm, thereby reducing the input data by
30%. The superpixels are subsequently used to generate a region adjacency
graph. Finally, the graph is passed through a state-of-the-art graph
convolutional neural network to obtain classification scores. We also analyze
the spatial and spectral convolution filtering techniques in graph neural
networks. Spectral-based models perform better than spatial-based models and
classical CNNs at a lower computational cost.
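The pipeline above can be sketched end to end with off-the-shelf tools. The snippet below is a minimal illustration using scikit-image for Quickshift and PyTorch Geometric for the graph convolutional classifier; the segmentation parameters, mean-colour node features, and two-layer network are assumptions for illustration, not the settings used in the survey.

```python
# Sketch of the three-step pipeline: Quickshift superpixels -> region
# adjacency graph -> graph convolutional classifier. Parameter values and
# feature choices are illustrative assumptions, not the survey's settings.
import numpy as np
import torch
import torch.nn.functional as F
from skimage.segmentation import quickshift
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, global_mean_pool


def image_to_rag(image):
    """Convert an RGB image (H, W, 3) with values in [0, 1] to a region adjacency graph."""
    labels = quickshift(image, kernel_size=3, max_dist=6, ratio=0.5)
    labels = np.unique(labels, return_inverse=True)[1].reshape(labels.shape)
    n = int(labels.max()) + 1
    # Node features: mean colour of each superpixel.
    x = np.stack([image[labels == s].mean(axis=0) for s in range(n)]).astype(np.float32)
    # Edges: superpixels that touch horizontally or vertically are adjacent.
    edges = set()
    h, w = labels.shape
    for i in range(h):
        for j in range(w):
            for di, dj in ((0, 1), (1, 0)):
                if i + di < h and j + dj < w and labels[i, j] != labels[i + di, j + dj]:
                    a, b = labels[i, j], labels[i + di, j + dj]
                    edges.add((int(min(a, b)), int(max(a, b))))
    edge_index = torch.tensor(sorted(edges), dtype=torch.long).t()
    edge_index = torch.cat([edge_index, edge_index.flip(0)], dim=1)  # make undirected
    return Data(x=torch.from_numpy(x), edge_index=edge_index)


class RAGClassifier(torch.nn.Module):
    """Two GCN layers followed by mean pooling and a linear read-out."""

    def __init__(self, num_classes, hidden=64):
        super().__init__()
        self.conv1 = GCNConv(3, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.lin = torch.nn.Linear(hidden, num_classes)

    def forward(self, data):
        x = F.relu(self.conv1(data.x, data.edge_index))
        x = F.relu(self.conv2(x, data.edge_index))
        batch = torch.zeros(x.size(0), dtype=torch.long)   # a single graph
        return self.lin(global_mean_pool(x, batch))        # classification scores
```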
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
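To make the localized-feature idea above concrete, the hedged sketch below summarises each node by a histogram of its neighbours' features and scores it against learnable reference histograms with the histogram intersection kernel; the actual GNN-LoFI layer differs in its details, and the bin count and number of references are arbitrary choices.

```python
# Hedged sketch of 'localized feature-based histogram intersection': each node
# is summarised by a histogram of the (scalar, [0, 1]-valued) features of its
# neighbours, then scored against learnable reference histograms with the
# intersection kernel sum(min(a, b)). The real GNN-LoFI layer differs in detail.
import torch


def neighbourhood_histograms(x, edge_index, bins=8):
    """x: (N, 1) node features in [0, 1]; edge_index: (2, E) directed edges."""
    n = x.size(0)
    src, dst = edge_index
    bin_ids = (x[src, 0] * (bins - 1)).round().long().clamp(0, bins - 1)
    hist = torch.zeros(n, bins)
    hist.index_put_((dst, bin_ids), torch.ones(src.size(0)), accumulate=True)
    return hist / hist.sum(dim=1, keepdim=True).clamp(min=1.0)   # normalise per node


class HistogramIntersectionLayer(torch.nn.Module):
    def __init__(self, bins=8, num_refs=16):
        super().__init__()
        self.refs = torch.nn.Parameter(torch.rand(num_refs, bins))

    def forward(self, hist):
        refs = torch.softmax(self.refs, dim=1)   # keep reference histograms normalised
        # Histogram intersection between every node and every reference histogram.
        return torch.minimum(hist.unsqueeze(1), refs.unsqueeze(0)).sum(dim=2)
```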
- Graph Neural Networks for Image Classification and Reinforcement Learning using Graph representations [15.256931959393803]
We will evaluate the performance of graph neural networks in two distinct domains: computer vision and reinforcement learning.
In the computer vision section, we seek to learn whether a novel, non-redundant representation of images as graphs can improve performance over a trivial pixel-to-node mapping on a graph-level prediction task, specifically image classification.
For the reinforcement learning section, we seek to learn whether explicitly modeling Rubik's cube solving as a graph problem can improve performance over a standard model-free technique with no inductive bias.
arXiv Detail & Related papers (2022-03-07T15:16:31Z)
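For reference, the trivial pixel-to-node mapping mentioned above can be written in a few lines: every pixel becomes a node and 4-neighbouring pixels are connected. The paper's non-redundant graph representation is not reproduced here.

```python
# The 'trivial pixel to node mapping' baseline: every pixel becomes a node and
# 4-neighbouring pixels are connected. The paper's non-redundant representation
# is not reproduced here.
import torch
from torch_geometric.data import Data


def pixels_to_graph(image):
    """image: float tensor of shape (C, H, W)."""
    c, h, w = image.shape
    x = image.reshape(c, -1).t()                  # (H*W, C) node features
    idx = torch.arange(h * w).reshape(h, w)
    right = torch.stack([idx[:, :-1].flatten(), idx[:, 1:].flatten()])
    down = torch.stack([idx[:-1, :].flatten(), idx[1:, :].flatten()])
    edge_index = torch.cat([right, down], dim=1)
    edge_index = torch.cat([edge_index, edge_index.flip(0)], dim=1)  # undirected
    return Data(x=x, edge_index=edge_index)


# Example: a 28x28 grayscale image becomes a 784-node grid graph.
graph = pixels_to_graph(torch.rand(1, 28, 28))
```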
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
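Bernoulli sampling from a graphon has a compact expression: draw latent node positions uniformly on [0, 1] and flip an independent coin for each edge with probability given by the graphon. The sketch below uses an arbitrary illustrative graphon and shows the growing-graph idea only schematically.

```python
# Sampling an n-node graph 'Bernoulli from the graphon' W: draw latent points
# u_i uniformly on [0, 1] and include edge (i, j) with probability W(u_i, u_j).
# The graphon chosen below is an arbitrary illustration.
import numpy as np


def sample_graph_from_graphon(n, graphon, rng):
    u = rng.uniform(size=n)                          # latent node positions
    probs = graphon(u[:, None], u[None, :])          # edge probabilities W(u_i, u_j)
    upper = np.triu(rng.uniform(size=(n, n)) < probs, k=1)
    return (upper | upper.T).astype(np.int8)         # symmetric adjacency, no self-loops


rng = np.random.default_rng(0)
graphon = lambda a, b: 0.8 * np.exp(-3.0 * np.abs(a - b))

# Growing-graph training in miniature: train on progressively larger samples.
for n in (50, 100, 200):
    adjacency = sample_graph_from_graphon(n, graphon, rng)
    # ... train / fine-tune the GNN on this Bernoulli-sampled graph ...
```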
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework, AdaGNN, with a smooth, adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
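A learnable frequency response can be realised, in the spirit of AdaGNN, by letting each feature channel subtract a learnable multiple of its Laplacian-filtered self; the exact parameterisation in the paper may differ, and the initial value of phi below is an assumption.

```python
# Hedged sketch of a learnable frequency response in the spirit of AdaGNN:
# each feature channel k is filtered as x_k <- x_k - phi_k * (L x_k), so the
# learnable phi_k controls how strongly high graph frequencies are attenuated
# per channel. The exact parameterisation in the paper may differ, and the
# initial value of phi is an assumption.
import torch


class AdaptiveFrequencyFilter(torch.nn.Module):
    def __init__(self, num_channels):
        super().__init__()
        self.phi = torch.nn.Parameter(torch.full((num_channels,), 0.5))

    def forward(self, x, laplacian):
        """x: (N, C) node features; laplacian: (N, N) normalised graph Laplacian."""
        return x - (laplacian @ x) * self.phi    # per-channel frequency response
```

In this sketch, a phi_k close to zero passes channel k almost unchanged, while a larger phi_k smooths that channel more aggressively.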
- Variational models for signal processing with Graph Neural Networks [3.5939555573102853]
This paper is devoted to signal processing on point-clouds by means of neural networks.
In this work, we investigate the use of variational models for such Graph Neural Networks to process signals on graphs for unsupervised learning.
arXiv Detail & Related papers (2021-03-30T13:31:11Z)
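The simplest variational model for a graph signal is quadratic: minimise a data-fidelity term plus a Laplacian smoothness term, which has a closed-form solution. The sketch below shows only this basic instance; the paper studies richer variational formulations inside Graph Neural Networks.

```python
# Minimal variational model for a signal y on a graph: minimise
# ||x - y||^2 + lam * x^T L x, whose closed-form minimiser is
# x = (I + lam * L)^{-1} y. The paper studies richer variational formulations
# inside GNNs; this is only the simplest instance of the idea.
import numpy as np


def tikhonov_denoise(y, laplacian, lam=1.0):
    """y: (N,) noisy graph signal; laplacian: (N, N) graph Laplacian."""
    n = laplacian.shape[0]
    return np.linalg.solve(np.eye(n) + lam * laplacian, y)
```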
- Processing of incomplete images by (graph) convolutional neural networks [7.778461949427663]
We investigate the problem of training neural networks from incomplete images without replacing missing values.
We first represent an image as a graph, in which missing pixels are entirely ignored.
The resulting graph representation of the image is processed using a spatial graph convolutional network.
arXiv Detail & Related papers (2020-10-26T21:40:03Z)
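One way to read "missing pixels are entirely ignored" is that only observed pixels become graph nodes, with edges between observed 4-neighbours; the sketch below implements that reading, and the details are assumptions rather than the paper's exact construction.

```python
# One reading of 'missing pixels are entirely ignored': only observed pixels
# become nodes, and edges connect observed pixels that are 4-neighbours.
# Details are assumptions, not the paper's exact construction.
import torch
from torch_geometric.data import Data


def incomplete_image_to_graph(image, mask):
    """image: (C, H, W) tensor; mask: (H, W) bool tensor, True = observed pixel."""
    c, h, w = image.shape
    node_id = torch.full((h, w), -1, dtype=torch.long)
    node_id[mask] = torch.arange(int(mask.sum()))
    x = image[:, mask].t()                           # features of observed pixels only
    edges = []
    for di, dj in ((0, 1), (1, 0)):
        a = node_id[: h - di, : w - dj].flatten()
        b = node_id[di:, dj:].flatten()
        keep = (a >= 0) & (b >= 0)                   # keep edges between observed pixels
        edges.append(torch.stack([a[keep], b[keep]]))
    edge_index = torch.cat(edges, dim=1)
    edge_index = torch.cat([edge_index, edge_index.flip(0)], dim=1)  # undirected
    return Data(x=x, edge_index=edge_index)
```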
- Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
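The flavour of unrolling a graph smoothness prior into a fixed number of network-like layers can be sketched with plain gradient steps on a quadratic graph-Laplacian objective; the paper unrolls graph total variation with learned graphs and weights, so the snippet below is only a simplified stand-in.

```python
# Sketch of unrolling a graph smoothness prior into a fixed number of
# network-like layers: plain gradient steps on 0.5*||x - y||^2 + 0.5*mu*x^T L x.
# The paper unrolls graph total variation with learned graphs and weights,
# so this is only a simplified stand-in with fixed, illustrative parameters.
import numpy as np


def unrolled_graph_denoiser(y, laplacian, mu=1.0, step=0.1, num_layers=8):
    """y: (N,) noisy signal on a patch graph; each 'layer' is one gradient step."""
    x = y.copy()
    for _ in range(num_layers):
        grad = (x - y) + mu * (laplacian @ x)   # data-fidelity term + smoothness term
        x = x - step * grad
    return x
```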
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
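A graph convolutional filter in the graph-signal-processing sense is a polynomial of the graph shift operator, and its permutation equivariance can be checked numerically; the coefficients and the random graph below are arbitrary illustrations.

```python
# A graph convolutional filter as a polynomial of the graph shift operator S
# (adjacency or Laplacian): y = sum_k h_k S^k x. Such filters are permutation
# equivariant: relabelling the nodes permutes the output the same way.
# The coefficients and random graph below are arbitrary illustrations.
import numpy as np


def graph_filter(shift, x, coeffs):
    """Apply y = sum_k coeffs[k] * S^k x for shift operator S and signal x."""
    y = np.zeros_like(x, dtype=float)
    s_k_x = x.astype(float)                     # S^0 x
    for h_k in coeffs:
        y += h_k * s_k_x
        s_k_x = shift @ s_k_x                   # advance to S^(k+1) x
    return y


rng = np.random.default_rng(0)
S = rng.integers(0, 2, size=(6, 6))
S = np.triu(S, 1)
S = S + S.T                                     # symmetric adjacency matrix
x = rng.normal(size=6)
P = np.eye(6)[rng.permutation(6)]               # permutation matrix
h = [0.5, 0.3, 0.2]
lhs = graph_filter(P @ S @ P.T, P @ x, h)       # filter on the relabelled graph
rhs = P @ graph_filter(S, x, h)                 # relabel the filtered output
assert np.allclose(lhs, rhs)                    # permutation equivariance holds
```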
- Analyzing Neural Networks Based on Random Graphs [77.34726150561087]
We perform a massive evaluation of neural networks with architectures corresponding to random graphs of various types.
We find that none of the classical numerical graph invariants by itself allows us to single out the best networks.
We also find that networks with primarily short-range connections perform better than networks which allow for many long-range connections.
arXiv Detail & Related papers (2020-02-19T11:04:49Z)
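The kind of analysis described above can be mimicked with networkx: generate random graphs of different types and compute classical numerical invariants for each. The generators, sizes, and parameters below are arbitrary, and no link to network accuracy is computed.

```python
# Mimic of the analysis described above: generate random graphs of different
# types and compute classical numerical invariants for each. Generators, sizes,
# and parameters are arbitrary, and no link to network accuracy is computed.
import networkx as nx

generators = {
    "watts_strogatz(p=0.1)": lambda: nx.connected_watts_strogatz_graph(64, 4, 0.1, seed=0),
    "watts_strogatz(p=0.9)": lambda: nx.connected_watts_strogatz_graph(64, 4, 0.9, seed=0),
    "erdos_renyi(p=0.1)": lambda: nx.erdos_renyi_graph(64, 0.1, seed=0),
}

for name, make in generators.items():
    g = make()
    if not nx.is_connected(g):                   # path lengths need a connected graph
        g = g.subgraph(max(nx.connected_components(g), key=len)).copy()
    print(name,
          "avg_path_len=%.2f" % nx.average_shortest_path_length(g),
          "clustering=%.2f" % nx.average_clustering(g))
```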
- Superpixel Image Classification with Graph Attention Networks [4.714325419968082]
This paper presents a methodology for image classification using Graph Neural Network (GNN) models.
We transform the input images into region adjacency graphs (RAGs), in which regions are superpixels and edges connect neighboring superpixels.
Experiments suggest that Graph Attention Networks (GATs), which combine graph convolutions with self-attention mechanisms, outperform other GNN models.
arXiv Detail & Related papers (2020-02-13T14:52:32Z)
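A GAT classifier over a region adjacency graph can be assembled directly from PyTorch Geometric building blocks; the hidden sizes and head counts below are illustrative assumptions, and the RAG itself can be built as in the superpixel sketch near the top of this page.

```python
# Hedged sketch of a GAT classifier over a region adjacency graph: attention
# heads learn how much each superpixel attends to its neighbours. Hidden sizes
# and head counts are illustrative, not the paper's settings.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv, global_mean_pool


class RAGGATClassifier(torch.nn.Module):
    def __init__(self, in_channels=3, num_classes=10, hidden=32, heads=4):
        super().__init__()
        self.gat1 = GATConv(in_channels, hidden, heads=heads)
        self.gat2 = GATConv(hidden * heads, hidden, heads=1)
        self.lin = torch.nn.Linear(hidden, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.elu(self.gat1(x, edge_index))
        x = F.elu(self.gat2(x, edge_index))
        return self.lin(global_mean_pool(x, batch))   # one score vector per image
```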