Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised
Node Classification
- URL: http://arxiv.org/abs/2107.13059v1
- Date: Tue, 27 Jul 2021 19:47:53 GMT
- Title: Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised
Node Classification
- Authors: Yu Wang, Yuesong Shen, Daniel Cremers
- Abstract summary: We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model effectively improves performance for semi-supervised node classification on graphs.
- Score: 59.06717774425588
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Node features and structural information of a graph are both crucial for
semi-supervised node classification problems. A variety of graph neural network
(GNN) based approaches have been proposed to tackle these problems, which
typically determine output labels through feature aggregation. This can be
problematic, as it implies conditional independence of output nodes given
hidden representations, despite their direct connections in the graph. To learn
the direct influence among output nodes in a graph, we propose the Explicit
Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph
as a partially observed Markov Random Field. It contains explicit pairwise
factors to model output-output relations and uses a GNN backbone to model
input-output relations. To balance model complexity and expressivity, the
pairwise factors have a shared component and a separate scaling coefficient for
each edge. We apply the EM algorithm to train our model, and utilize a
star-shaped piecewise likelihood for the tractable surrogate objective. We
conduct experiments on various datasets, which show that our model can
effectively improve performance for semi-supervised node classification on
graphs.
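The abstract gives no implementation details beyond the factor structure, so the following is a minimal sketch, assuming a PyTorch setting and an arbitrary GNN backbone, of how the unary (input-output) factors and the shared pairwise (output-output) factors with per-edge scaling coefficients could be parameterized. All class and variable names (PairwiseFactorMRF, W_shared, edge_scale) are illustrative, not the authors' code.

```python
import torch
import torch.nn as nn

class PairwiseFactorMRF(nn.Module):
    """Illustrative only: unary potentials come from a GNN backbone,
    pairwise potentials share one C x C matrix scaled per edge."""

    def __init__(self, backbone: nn.Module, num_edges: int, num_classes: int):
        super().__init__()
        self.backbone = backbone                               # any GNN mapping features to per-node logits
        self.W_shared = nn.Parameter(torch.zeros(num_classes, num_classes))
        self.edge_scale = nn.Parameter(torch.ones(num_edges))  # one scaling coefficient per edge

    def energy(self, x, edge_index, labels):
        """Negative log-potential (up to a constant) of a full label assignment.

        x          : [N, F] node features
        edge_index : [2, E] endpoints (i, j) of each edge
        labels     : [N]    candidate label assignment
        """
        unary = self.backbone(x, edge_index)                   # [N, C] input-output factors
        e_unary = unary[torch.arange(x.size(0)), labels].sum()

        src, dst = edge_index                                   # output-output factors
        pairwise = self.W_shared[labels[src], labels[dst]]      # [E]
        e_pair = (self.edge_scale * pairwise).sum()

        return -(e_unary + e_pair)                              # lower energy = more probable assignment
```

Maximizing the likelihood of such a partially observed MRF exactly is intractable in general; the paper trains with the EM algorithm and a star-shaped piecewise likelihood as a tractable surrogate, which this sketch does not implement.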
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
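For orientation only, a plain (non-kernelized) Gumbel-Softmax relaxation for sampling soft neighbours looks roughly like the sketch below; the kernelized operator that lets NodeFormer avoid the dense all-pair matrix is not reproduced here, and the function name is hypothetical.

```python
import torch

def gumbel_softmax_edges(scores: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Differentiable relaxation of sampling neighbours from pairwise scores.

    scores : [N, N] unnormalized affinities (dense here for clarity; NodeFormer
             avoids materializing this O(N^2) matrix via a kernelized estimator).
    Returns soft one-hot neighbour weights, [N, N].
    """
    gumbel = -torch.log(-torch.log(torch.rand_like(scores) + 1e-10) + 1e-10)
    return torch.softmax((scores + gumbel) / tau, dim=-1)
```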
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase it models the distribution of features associated with the nodes of the given graph; in the second it generates the edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
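A minimal sketch of the per-relation weighting idea summarized above, assuming a PyTorch message-passing formulation; the layer name and signature are illustrative, not the RE-GNN reference implementation.

```python
import torch
import torch.nn as nn

class RelationScaledAggregation(nn.Module):
    """One learnable scalar per relation (plus one for self-loops) weights each
    message before a standard homogeneous sum aggregation."""

    def __init__(self, in_dim: int, out_dim: int, num_relations: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.rel_weight = nn.Parameter(torch.ones(num_relations))  # one scalar per relation
        self.self_weight = nn.Parameter(torch.ones(1))             # self-loop importance

    def forward(self, x, edge_index, edge_type):
        # x: [N, F], edge_index: [2, E] (long), edge_type: [E] integer relation ids
        src, dst = edge_index
        msg = self.lin(x)[src] * self.rel_weight[edge_type].unsqueeze(-1)
        out = self.self_weight * self.lin(x)        # self-loop term
        out = out.index_add(0, dst, msg)            # sum incoming messages per target node
        return out
```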
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - High-Order Pooling for Graph Neural Networks with Tensor Decomposition [23.244580796300166]
Graph Neural Networks (GNNs) are attracting growing attention due to their effectiveness and flexibility in modeling a variety of graph-structured data.
We propose the Tensorized Graph Neural Network (tGNN), a highly expressive GNN architecture relying on tensor decomposition to model high-order non-linear node interactions.
arXiv Detail & Related papers (2022-05-24T01:12:54Z) - Meta-Weight Graph Neural Network: Push the Limits Beyond Global
Homophily [24.408557217909316]
Graph Neural Networks (GNNs) show strong expressive power on graph data mining.
However, not all graphs are homophilic, and even within the same graph the distributions may vary significantly.
We propose Meta Weight Graph Neural Network (MWGNN) to adaptively construct graph convolution layers for different nodes.
arXiv Detail & Related papers (2022-03-19T09:27:38Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - GAIN: Graph Attention & Interaction Network for Inductive
Semi-Supervised Learning over Large-scale Graphs [18.23435958000212]
Graph Neural Networks (GNNs) have led to state-of-the-art performance on a variety of machine learning tasks such as recommendation, node classification and link prediction.
Most existing GNN models exploit a single type of aggregator to aggregate neighboring nodes' information.
We propose a novel graph neural network architecture, Graph Attention & Interaction Network (GAIN), for inductive learning on graphs.
arXiv Detail & Related papers (2020-11-03T00:20:24Z) - CopulaGNN: Towards Integrating Representational and Correlational Roles
of Graphs in Graph Neural Networks [23.115288017590093]
We investigate how Graph Neural Network (GNN) models can effectively leverage both types of information.
The proposed Copula Graph Neural Network (CopulaGNN) can take a wide range of GNN models as base models.
arXiv Detail & Related papers (2020-10-05T15:20:04Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
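As a worked illustration of the denoising view described in the last item, the commonly cited objective and its one-step gradient-descent connection to neighbourhood aggregation can be written as follows (notation and constants are illustrative, not necessarily the paper's):

```latex
% X : noisy node features, F : recovered signal, L : normalized graph Laplacian
\min_{F} \; \mathcal{J}(F) = \|F - X\|_F^2 + c \,\mathrm{tr}\!\left(F^{\top} L F\right)
% Gradient: \nabla_F \mathcal{J} = 2(F - X) + 2c\,L F.
% One gradient step from F = X with step size b:
%   F \leftarrow X - 2bc\,L X = \bigl((1 - 2bc)\,I + 2bc\,\tilde{A}\bigr) X,
% which for L = I - \tilde{A} and 2bc = 1 reduces to the familiar
% GCN-style aggregation F \leftarrow \tilde{A} X.
```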
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.