Interpreting and Unifying Graph Neural Networks with An Optimization Framework
- URL: http://arxiv.org/abs/2101.11859v1
- Date: Thu, 28 Jan 2021 08:06:02 GMT
- Title: Interpreting and Unifying Graph Neural Networks with An Optimization Framework
- Authors: Meiqi Zhu, Xiao Wang, Chuan Shi, Houye Ji, Peng Cui
- Abstract summary: Graph Neural Networks (GNNs) have received considerable attention for graph-structured data learning.
In this paper, we establish a surprising connection between different propagation mechanisms and a unified optimization problem.
Our proposed unified optimization framework, summarizing the commonalities between several of the most representative GNNs, opens up new opportunities for flexibly designing new GNNs.
- Score: 47.44773358082203
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have received considerable attention for
graph-structured data learning across a wide variety of tasks. The well-designed
propagation mechanism, whose effectiveness has been repeatedly demonstrated, is
the most fundamental part of GNNs. Although most GNNs basically follow a message
passing scheme, little effort has been made to discover and analyze their
essential relations. In this paper, we establish a surprising connection between
different propagation mechanisms and a unified optimization problem, showing
that despite the proliferation of various GNNs, their proposed propagation
mechanisms are in fact the optimal solutions of a feature fitting function,
defined over a wide class of graph kernels, combined with a graph regularization
term. Our proposed unified optimization framework, summarizing the commonalities
between several of the most representative GNNs, not only provides a macroscopic
view for surveying the relations between different GNNs, but also opens up new
opportunities for flexibly designing new GNNs. With the proposed framework, we
discover that existing works usually adopt naive graph convolutional kernels for
the feature fitting function, and we further develop two novel objective
functions with adjustable graph kernels that exhibit low-pass or high-pass
filtering capabilities, respectively. Moreover, we provide convergence proofs
and expressive power comparisons for the proposed models. Extensive experiments
on benchmark datasets clearly show that the proposed GNNs not only outperform
the state-of-the-art methods but also effectively alleviate over-smoothing,
further verifying the feasibility of designing GNNs with our unified
optimization framework.
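For intuition, the fitting-plus-regularization objective described in the abstract is commonly written in the following form (a minimal sketch in standard notation; the kernels F1, F2 and the weights ζ, ξ are generic placeholders rather than a verbatim quote of the paper):

```latex
\min_{\mathbf{Z}} \; \zeta \left\lVert \mathbf{F}_1 \mathbf{Z} - \mathbf{F}_2 \mathbf{H} \right\rVert_F^2
  + \xi \, \operatorname{tr}\!\left( \mathbf{Z}^{\top} \tilde{\mathbf{L}} \mathbf{Z} \right)
```

Here H denotes the input node features, Z the propagated representations, L̃ the symmetrically normalized graph Laplacian, and F1, F2 graph convolutional kernels. In the simplest case F1 = F2 = I, iterating fixed-point (gradient) steps on this objective yields a personalized-PageRank-style propagation. The sketch below, which assumes a dense NumPy adjacency matrix purely for illustration, shows that iteration:

```python
import numpy as np

def unified_propagation(H, A_norm, alpha=0.1, num_steps=10):
    """Fixed-point iteration for min_Z ||Z - H||_F^2 + c * tr(Z^T L Z).

    A_norm : symmetrically normalized adjacency matrix (I - L_tilde).
    alpha  : fitting weight, alpha = 1 / (1 + c); smaller alpha means
             stronger graph smoothing.

    Each step mixes the neighborhood average A_norm @ Z with the
    original features H, converging to the closed-form minimizer
    alpha * (I - (1 - alpha) * A_norm)^{-1} @ H.
    """
    Z = H.copy()
    for _ in range(num_steps):
        Z = (1.0 - alpha) * (A_norm @ Z) + alpha * H
    return Z
```

Choosing non-identity kernels F1 and F2 (e.g., kernels parameterized by the adjacency matrix) changes this update rule, which is how adjustable low-pass and high-pass variants can be derived from the same objective.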
Related papers
- Self-supervision meets kernel graph neural models: From architecture to augmentations [36.388069423383286]
We improve the design and learning of kernel graph neural networks (KGNNs)
We develop a novel structure-preserving graph data augmentation method called latent graph augmentation (LGA)
Our proposed model achieves performance comparable to, and sometimes better than, state-of-the-art graph representation learning frameworks.
arXiv Detail & Related papers (2023-10-17T14:04:22Z) - Neural Tangent Kernels Motivate Graph Neural Networks with
Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs)
Our results establish theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that enables homogeneous GNNs to adequately handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Towards Better Generalization with Flexible Representation of
Multi-Module Graph Neural Networks [0.27195102129094995]
We use a random graph generator to investigate how the graph size and structural properties affect the predictive performance of GNNs.
We present specific evidence that the average node degree is a key feature in determining whether GNNs can generalize to unseen graphs.
We propose a multi-module GNN framework that allows the network to adapt flexibly to new graphs by generalizing a single canonical nonlinear transformation over aggregated inputs.
arXiv Detail & Related papers (2022-09-14T12:13:59Z) - EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural
Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
arXiv Detail & Related papers (2022-05-27T10:48:14Z) - Ensemble Multi-Relational Graph Neural Networks [18.96097003317416]
We propose a novel ensemble multi-relational GNN by designing an ensemble multi-relational (EMR) optimization objective.
This EMR optimization objective is able to derive an iterative updating rule, which can be formalized as an ensemble message passing layer over multiple relations.
Extensive experiments conducted on four benchmark datasets well demonstrate the effectiveness of the proposed model.
arXiv Detail & Related papers (2022-05-24T13:52:41Z) - GPN: A Joint Structural Learning Framework for Graph Neural Networks [36.38529113603987]
We propose a GNN-based joint learning framework that simultaneously learns the graph structure and the downstream task.
Our method is the first GNN-based bilevel optimization framework for resolving this task.
arXiv Detail & Related papers (2022-05-12T09:06:04Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.