EdgeNets: Edge Varying Graph Neural Networks
- URL: http://arxiv.org/abs/2001.07620v3
- Date: Tue, 27 Jul 2021 14:02:26 GMT
- Title: EdgeNets: Edge Varying Graph Neural Networks
- Authors: Elvin Isufi, Fernando Gama, Alejandro Ribeiro
- Abstract summary: This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform and encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
- Score: 179.99395949679547
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Driven by the outstanding performance of neural networks in the structured
Euclidean domain, recent years have seen a surge of interest in developing
neural networks for graphs and data supported on graphs. The graph is leveraged
at each layer of the neural network as a parameterization to capture detail at
the node level with a reduced number of parameters and computational
complexity. Following this rationale, this paper puts forth a general framework
that unifies state-of-the-art graph neural networks (GNNs) through the concept
of EdgeNet. An EdgeNet is a GNN architecture that allows different nodes to use
different parameters to weigh the information of different neighbors. By
extrapolating this strategy to more iterations between neighboring nodes, the
EdgeNet learns edge- and neighbor-dependent weights to capture local detail.
This is a general linear and local operation that a node can perform and
encompasses under one formulation all existing graph convolutional neural
networks (GCNNs) as well as graph attention networks (GATs). In writing
different GNN architectures with a common language, EdgeNets highlight specific
architecture advantages and limitations, while providing guidelines to improve
their capacity without compromising their local implementation. An interesting
conclusion is the unification of GCNNs and GATs, approaches that have so far been
perceived as separate. In particular, we show that GATs are GCNNs on a
graph that is learned from the features. This particularization opens the doors
to develop alternative attention mechanisms for improving discriminatory power.
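The core idea of the abstract can be illustrated with a minimal NumPy sketch of one edge-varying layer. This is not the paper's implementation; the function and parameter names are illustrative, and a dense adjacency matrix is assumed. Each shift uses its own learnable parameter matrix, masked to the graph support (edges plus self-loops) so that every node weighs each neighbor with a distinct parameter while the operation stays local:

```python
import numpy as np

def edge_varying_layer(x, adj, taps):
    """One EdgeNet-style layer (sketch): each exchange k uses its own
    edge-dependent parameter matrix Phi_k, masked to the graph support
    so nodes only mix information with their neighbors."""
    n = adj.shape[0]
    support = (adj + np.eye(n)) > 0      # edges plus self-loops
    z = x
    out = np.zeros_like(x)
    for phi in taps:                     # taps: list of (n, n) parameter matrices
        z = (phi * support) @ z          # edge-varying local exchange
        out += z
    return np.tanh(out)
```

Setting every `phi` to a shared scalar multiple of the adjacency recovers a plain graph convolution, while letting `phi` depend on the node features recovers an attention-style mechanism, which is the unification the paper describes.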
Related papers
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks--CONN, a tailored GNN architecture for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- Neo-GNNs: Neighborhood Overlap-aware Graph Neural Networks for Link Prediction [23.545059901853815]
Graph Neural Networks (GNNs) have been widely applied to various fields for learning over graph-structured data.
We propose Neighborhood Overlap-aware Graph Neural Networks (Neo-GNNs), which learn useful structural features from overlapping neighborhoods in the adjacency matrix for link prediction.
arXiv Detail & Related papers (2022-06-09T01:43:49Z)
- HPGNN: Using Hierarchical Graph Neural Networks for Outdoor Point Cloud Processing [0.7649716717097428]
Motivated by recent improvements in point cloud processing for autonomous navigation, we focus on using hierarchical graph neural networks for processing point clouds.
We propose the Hierarchical Point Graph Neural Network (HPGNN), which learns node features at various levels of graph coarseness to extract information.
This enables learning over a large point cloud while retaining the fine details that existing point-level graph networks struggle to capture.
arXiv Detail & Related papers (2022-06-05T11:18:09Z)
- A Variational Edge Partition Model for Supervised Graph Representation Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into different communities, the community-specific GNNs themselves, and a GNN-based predictor that combines them for the end classification task.
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- Edgeless-GNN: Unsupervised Inductive Edgeless Network Embedding [7.391641422048645]
We study the problem of embedding edgeless nodes such as users who newly enter the underlying network.
We propose Edgeless-GNN, a new framework that enables GNNs to generate node embeddings even for edgeless nodes through unsupervised inductive learning.
arXiv Detail & Related papers (2021-04-12T06:37:31Z)
- Enhance Information Propagation for Graph Neural Network by Heterogeneous Aggregations [7.3136594018091134]
Graph neural networks are emerging as a continuation of deep learning's success on graph data.
We propose HAG-Net, which enhances information propagation among GNN layers by combining heterogeneous aggregations.
We empirically validate the effectiveness of HAG-Net on a number of graph classification benchmarks.
arXiv Detail & Related papers (2021-02-08T08:57:56Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
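The last two entries describe GNN layers built from graph convolutional filters, i.e. polynomials in a graph shift operator, and claim permutation equivariance as a fundamental property. A minimal NumPy sketch of such a filter, with illustrative names only and the adjacency matrix assumed as the shift operator:

```python
import numpy as np

def graph_filter(x, shift, coeffs):
    """Polynomial graph convolutional filter (sketch):
    y = sum_k h_k S^k x, where S is a graph shift operator
    (e.g. the adjacency matrix) and h_k are the filter taps."""
    y = np.zeros_like(x, dtype=float)
    z = x.astype(float)
    for h in coeffs:
        y += h * z
        z = shift @ z    # one more application of the graph shift
    return y
```

Because each power of `S` commutes with a simultaneous relabeling of the nodes in `S` and `x`, permuting the inputs permutes the output in the same way, which is the equivariance property these papers analyze.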
This list is automatically generated from the titles and abstracts of the papers in this site.