Learning on Attribute-Missing Graphs
- URL: http://arxiv.org/abs/2011.01623v1
- Date: Tue, 3 Nov 2020 11:09:52 GMT
- Title: Learning on Attribute-Missing Graphs
- Authors: Xu Chen and Siheng Chen and Jiangchao Yao and Huangjie Zheng and Ya
Zhang and Ivor W Tsang
- Abstract summary: In practice, a graph may have attributes available for only some of its nodes, while those of the remaining nodes are entirely missing.
Existing graph learning methods, including popular GNNs, cannot provide satisfactory learning performance on such graphs.
We develop a novel distribution-matching-based GNN called the structure-attribute transformer (SAT) for attribute-missing graphs.
- Score: 66.76561524848304
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs with complete node attributes have been widely explored
recently. In practice, however, there are graphs where attributes are available
for only some of the nodes while those of the others are entirely missing. Such
attribute-missing graphs arise in numerous real-world applications, yet there
are few studies investigating the corresponding learning problems. Existing
graph learning methods, including the popular GNNs, cannot provide satisfactory
learning performance since they are not designed for attribute-missing graphs.
Designing a new GNN for such graphs is therefore a pressing issue for the graph
learning community. In this paper, we make a shared-latent space assumption on
graphs and develop a novel distribution-matching-based GNN called the
structure-attribute transformer (SAT) for attribute-missing graphs. SAT
leverages structures and attributes in a decoupled scheme and achieves joint
distribution modeling of structures and attributes via distribution matching
techniques. It can perform not only the link prediction task but also the newly
introduced node attribute completion task. Furthermore, practical measures are
introduced to quantify the performance of node attribute completion. Extensive
experiments on seven real-world datasets indicate that SAT outperforms other
methods on both the link prediction and node attribute completion tasks. Code
and data are available online: https://github.com/xuChenSJTU/SAT-master-online
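For orientation, the sketch below illustrates the general recipe the abstract
describes: encode the graph structure and the observed node attributes with two
decoupled encoders into a shared latent space, align the two latent
distributions with a distribution-matching penalty, and decode attributes for
the attribute-missing nodes. This is a minimal sketch under assumptions: the
module names, hidden sizes, and the use of an RBF-kernel MMD as the matching
term are illustrative choices, not the authors' implementation; the official
code is at the repository linked above.

```python
# Illustrative sketch only: decoupled structure/attribute encoders with a
# shared latent space and an MMD-style distribution-matching penalty.
# Names, sizes, and the MMD choice are assumptions, not the official SAT code.
import torch
import torch.nn as nn


class StructureEncoder(nn.Module):
    """Encodes each node from its adjacency row into the shared latent space."""
    def __init__(self, num_nodes, latent_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(num_nodes, 256), nn.ReLU(),
                                 nn.Linear(256, latent_dim))

    def forward(self, adj):           # adj: [N, N]
        return self.net(adj)          # [N, latent_dim]


class AttributeEncoder(nn.Module):
    """Encodes observed node attributes into the same latent space."""
    def __init__(self, attr_dim, latent_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(attr_dim, 256), nn.ReLU(),
                                 nn.Linear(256, latent_dim))

    def forward(self, x):             # x: [N_obs, attr_dim]
        return self.net(x)


class AttributeDecoder(nn.Module):
    """Maps latent codes back to attribute space (used for attribute completion)."""
    def __init__(self, latent_dim, attr_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                 nn.Linear(256, attr_dim))

    def forward(self, z):
        return self.net(z)


def mmd_rbf(z_a, z_b, sigma=1.0):
    """RBF-kernel MMD used here as a stand-in distribution-matching term."""
    def kernel(a, b):
        d2 = torch.cdist(a, b) ** 2
        return torch.exp(-d2 / (2 * sigma ** 2))
    return kernel(z_a, z_a).mean() + kernel(z_b, z_b).mean() - 2 * kernel(z_a, z_b).mean()


def training_step(adj, x_obs, obs_idx, enc_s, enc_a, dec_a):
    """One illustrative loss computation on an attribute-missing graph.

    adj:     [N, N] dense adjacency (toy setting).
    x_obs:   [N_obs, D] attributes of the attribute-observed nodes.
    obs_idx: indices of those nodes in the graph.
    """
    z_struct = enc_s(adj)                      # latent codes for all nodes
    z_attr = enc_a(x_obs)                      # latent codes for observed nodes

    # Reconstruct observed attributes from both latent views.
    recon = nn.functional.mse_loss(dec_a(z_attr), x_obs) + \
            nn.functional.mse_loss(dec_a(z_struct[obs_idx]), x_obs)

    # Shared-latent-space assumption: push the two latent distributions together.
    match = mmd_rbf(z_struct[obs_idx], z_attr)

    return recon + 0.1 * match
```

At inference time, attributes of the attribute-missing nodes would be completed
by decoding their structure-derived latent codes, e.g. dec_a(z_struct[missing_idx]);
the actual SAT model additionally performs link prediction and uses a more
elaborate matching scheme than this sketch.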
Related papers
- Learning on Graphs with Out-of-Distribution Nodes [33.141867473074264]
Graph Neural Networks (GNNs) are state-of-the-art models for performing prediction tasks on graphs.
This work defines the problem of graph learning with out-of-distribution nodes.
We propose Out-of-Distribution Graph Attention Network (OODGAT), a novel GNN model which explicitly models the interaction between different kinds of nodes.
arXiv Detail & Related papers (2023-08-13T08:10:23Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- A Simple and Scalable Graph Neural Network for Large Directed Graphs [11.792826520370774]
We investigate various combinations of node representations and edge direction awareness within an input graph.
In response, we propose A2DUG, a simple yet holistic classification method.
We demonstrate that A2DUG performs stably well on various datasets and improves accuracy by up to 11.29 compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-06-14T06:24:58Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Node Feature Extraction by Self-Supervised Multi-scale Neighborhood Prediction [123.20238648121445]
We propose a new self-supervised learning framework, Graph Information Aided Node feature exTraction (GIANT).
GIANT makes use of the eXtreme Multi-label Classification (XMC) formalism, which is crucial for fine-tuning the language model based on graph information.
We demonstrate the superior performance of GIANT over the standard GNN pipeline on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2021-10-29T19:55:12Z)
- GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve strong performance on various learning tasks over geometric data.
In this paper, we propose a unified framework that most existing GNN explainers satisfy.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
arXiv Detail & Related papers (2021-04-18T10:40:37Z)
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance over state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and have shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Incomplete Graph Representation and Learning via Partial Graph Neural Networks [7.227805463462352]
In many applications, graphs may come in an incomplete form where the attributes of some nodes are unknown or missing.
Existing GNNs are generally designed for complete graphs and cannot directly handle attribute-incomplete graph data.
We develop novel partial-aggregation-based GNNs, named Partial Graph Neural Networks (PaGNNs), for attribute-incomplete graph representation and learning (a minimal sketch of the partial-aggregation idea appears after this list).
arXiv Detail & Related papers (2020-03-23T08:29:59Z)
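The partial-aggregation idea mentioned in the PaGNN entry can be pictured as
message passing that averages only over neighbors whose attributes are
observed, renormalizing by the number of such neighbors. The snippet below is a
minimal sketch under that reading; the function and variable names are
hypothetical and this is not the PaGNN reference implementation.

```python
# Illustrative sketch of "partial aggregation": average messages only over
# neighbors with observed attributes, renormalizing by their count.
# Hypothetical helper, not the PaGNN reference implementation.
import torch


def partial_mean_aggregate(adj, x, observed_mask, eps=1e-8):
    """adj: [N, N] 0/1 adjacency; x: [N, D] features (zeros where missing);
    observed_mask: [N] boolean, True where a node's attributes are known."""
    mask = observed_mask.float().unsqueeze(1)   # [N, 1]
    masked_x = x * mask                         # zero out attribute-missing nodes
    summed = adj @ masked_x                     # sum over observed neighbors only
    counts = adj @ mask                         # number of observed neighbors
    return summed / (counts + eps)              # renormalized mean


# Toy usage on a 4-node path graph where node 2's attributes are missing.
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=torch.float32)
x = torch.randn(4, 8)
observed = torch.tensor([True, True, False, True])
h = partial_mean_aggregate(adj, x, observed)    # [4, 8] aggregated features
```

The renormalization keeps the scale of the aggregated features comparable
across nodes with different numbers of attribute-observed neighbors.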
This list is automatically generated from the titles and abstracts of the papers in this site.