Graph Neural Networks in Computer Vision -- Architectures, Datasets and
Common Approaches
- URL: http://arxiv.org/abs/2212.10207v1
- Date: Tue, 20 Dec 2022 12:40:29 GMT
- Title: Graph Neural Networks in Computer Vision -- Architectures, Datasets and
Common Approaches
- Authors: Maciej Krzywda, Szymon Łukasik, Amir H. Gandomi
- Abstract summary: Graph Neural Networks (GNNs) are a family of neural networks that operate on graphs and model the relationships between their nodes.
This contribution collects and summarizes papers on GNN-based approaches to computer vision.
- Score: 10.60034824788636
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Graph Neural Networks (GNNs) are a family of neural networks that
operate on graphs and model the relationships between their nodes. In recent
years there has been increased interest in GNNs and their derivatives, i.e.,
Graph Attention Networks (GAT), Graph Convolutional Networks (GCN), and Graph
Recurrent Networks (GRN). Their use in computer vision has also grown. The
number of GNN applications in this field continues to expand; it includes video
analysis and understanding, action and behavior recognition, computational
photography, image and video synthesis from zero or few shots, and many more.
This contribution aims to collect papers published on GNN-based approaches to
computer vision. They are described and summarized from three perspectives.
First, we investigate the architectures of Graph Neural Networks and their
derivatives used in this area to provide accurate and explainable
recommendations for future investigations. Second, we present the datasets used
in these works. Finally, using graph analysis, we examine the relations between
GNN-based studies in computer vision and potential sources of inspiration
identified outside of this field.
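The abstract names GCN, GAT, and GRN as the main GNN derivatives. For orientation only, and not as code from the surveyed paper, a single graph-convolution layer of the GCN type can be sketched in a few lines of NumPy; the graph, feature dimensions, and weights below are invented purely for illustration.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN-style layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).

    A: (n, n) binary adjacency matrix, H: (n, f_in) node features,
    W: (f_in, f_out) weights. Shapes are illustrative assumptions.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)                   # node degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^-1/2
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)    # aggregate, transform, ReLU

# Toy example: a 4-node path graph with 3-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
print(gcn_layer(A, H, W).shape)  # (4, 2)
```

GAT and GRN layers follow the same aggregate-and-transform pattern but replace the fixed normalization with learned attention weights or a recurrent update, respectively.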
Related papers
- Graphs Unveiled: Graph Neural Networks and Graph Generation [0.0]
This paper provides a comprehensive overview of Graph Neural Networks (GNNs).
We discuss the applications of graph neural networks across various domains.
We also present an advanced topic in GNNs: graph generation.
arXiv Detail & Related papers (2024-03-18T14:37:27Z)
- A Study on Knowledge Graph Embeddings and Graph Neural Networks for Web Of Things [0.0]
Orange's vision for a knowledge graph in the domain of the Web Of Things (WoT) is to provide a digital representation of the physical world.
In this paper, we explore state-of-the-art knowledge graph embedding (KGE) methods to learn numerical representations of the graph entities.
We also investigate Graph Neural Networks (GNNs) alongside KGEs and compare their performance on the same downstream tasks.
arXiv Detail & Related papers (2023-10-23T12:36:33Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
The searched graph operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs (a minimal message-passing sketch over a KNN graph is given after this list).
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Theory of Graph Neural Networks: Representation and Learning [44.02161831977037]
Graph Neural Networks (GNNs) have become a popular learning model for prediction tasks on nodes, graphs and configurations of points.
This article summarizes a selection of the emerging theoretical results on approximation and learning properties of widely used message passing GNNs and higher-order GNNs.
arXiv Detail & Related papers (2022-04-16T02:08:50Z)
- Visualizing Graph Neural Networks with CorGIE: Corresponding a Graph to Its Embedding [16.80197065484465]
We propose an approach for corresponding an input graph to its node embedding (i.e., the latent space).
We develop an interactive multi-view interface called CorGIE to instantiate the abstraction.
We present how to use CorGIE in two usage scenarios, and conduct a case study with two GNN experts.
arXiv Detail & Related papers (2021-06-24T08:59:53Z)
- Computing Graph Neural Networks: A Survey from Algorithms to Accelerators [2.491032752533246]
Graph Neural Networks (GNNs) have exploded onto the machine learning scene in recent years owing to their capability to model and learn from graph-structured data.
This paper makes two main contributions: a review of the field of GNNs from the perspective of computing, and an in-depth analysis of current software and hardware acceleration schemes.
arXiv Detail & Related papers (2020-09-30T22:29:27Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black boxes and lack human-intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
arXiv Detail & Related papers (2020-06-03T23:52:43Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
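As referenced in the representation-bottleneck entry above, several of the listed papers contrast KNN and fully-connected graph constructions and rely on the message-passing view of GNNs. The sketch below is only an illustration of those two ideas, not code from any of the papers; the point coordinates, the choice of k, and the mean-aggregation update are assumptions made for the example.

```python
import numpy as np

def knn_graph(X, k):
    """Build a directed KNN adjacency matrix from point coordinates X of shape (n, d)."""
    n = X.shape[0]
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)           # exclude self-distances
    A = np.zeros((n, n))
    for i in range(n):
        A[i, np.argsort(dists[i])[:k]] = 1.0  # connect each node to its k nearest neighbors
    return A

def message_passing_step(A, H):
    """One round of mean aggregation over neighbors followed by a residual combine."""
    deg = A.sum(axis=1, keepdims=True)
    messages = (A @ H) / np.clip(deg, 1.0, None)  # average neighbor features
    return H + messages                           # simple residual update

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 3))   # 10 points in 3-D, purely illustrative
H = X.copy()                   # use coordinates as initial node features
A = knn_graph(X, k=3)
H1 = message_passing_step(A, H)
print(H1.shape)  # (10, 3)
```

Replacing knn_graph with an all-ones adjacency matrix (zero on the diagonal) gives the fully-connected construction contrasted in that entry.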