Joint Use of Node Attributes and Proximity for Semi-Supervised
Classification on Graphs
- URL: http://arxiv.org/abs/2010.11536v2
- Date: Tue, 14 Sep 2021 10:04:42 GMT
- Title: Joint Use of Node Attributes and Proximity for Semi-Supervised
Classification on Graphs
- Authors: Arpit Merchant, Michael Mathioudakis
- Abstract summary: We propose a principled approach, JANE, based on a generative probabilistic model that jointly weighs the role of attributes and node proximity.
Our experiments on a variety of network datasets demonstrate that JANE exhibits the desired combination of versatility and competitive performance.
- Score: 2.113059435430681
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The task of node classification is to infer unknown node labels, given the
labels for some of the nodes along with the network structure and other node
attributes. Typically, approaches for this task assume homophily, whereby
neighboring nodes have similar attributes and a node's label can be predicted
from the labels of its neighbors or other proximate (i.e., nearby) nodes in the
network. However, such an assumption may not always hold -- in fact, there are
cases where labels are better predicted from the individual attributes of each
node rather than the labels of its proximate nodes. Ideally, node
classification methods should flexibly adapt to a range of settings wherein
unknown labels are predicted either from labels of proximate nodes, or
individual node attributes, or partly both. In this paper, we propose a
principled approach, JANE, based on a generative probabilistic model that
jointly weighs the role of attributes and node proximity via embeddings in
predicting labels. Our experiments on a variety of network datasets demonstrate
that JANE exhibits the desired combination of versatility and competitive
performance compared to standard baselines.
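To make the setting concrete, the following is a minimal, illustrative sketch of the joint use of attributes and proximity, not JANE's actual generative model: it assumes a toy six-node graph with one attribute per node, uses leading eigenvectors of the normalized adjacency matrix as stand-in proximity embeddings, and lets a plain logistic-regression classifier weigh the concatenated attribute and embedding features when predicting labels for the unlabeled nodes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: 6 nodes forming two triangles joined by one edge, one attribute per node.
# Everything here (graph, attribute values, labels) is illustrative only.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
X = np.array([[0.9], [0.8], [0.7], [0.2], [0.1], [0.3]])  # node attributes
y = np.array([0, 0, -1, -1, 1, 1])                        # -1 = unlabeled

# Proximity embeddings: leading eigenvectors of the symmetrically normalized
# adjacency matrix (a simple stand-in for learned embeddings).
d = A.sum(axis=1)
A_norm = A / np.sqrt(np.outer(d, d))
eigvals, eigvecs = np.linalg.eigh(A_norm)
U = eigvecs[:, np.argsort(-eigvals)[:2]]  # two leading eigenvectors

# Joint use: concatenate attributes X with proximity embeddings U and let a
# classifier trained on the labeled nodes weigh the two sources of signal.
features = np.hstack([X, U])
labeled = y >= 0
clf = LogisticRegression().fit(features[labeled], y[labeled])
print(clf.predict(features[~labeled]))  # predicted labels for the unlabeled nodes
```

When neighborhoods carry the signal, the embedding columns dominate; when individual attributes are more predictive, the classifier can rely on them instead. The generative model described in the abstract weighs this trade-off in a principled, probabilistic way.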
Related papers
- Contrastive Meta-Learning for Few-shot Node Classification [54.36506013228169]
Few-shot node classification aims to predict labels for nodes on graphs with only limited labeled nodes as references.
We create a novel contrastive meta-learning framework on graphs, named COSMIC, with two key designs.
arXiv Detail & Related papers (2023-06-27T02:22:45Z)
- Refined Edge Usage of Graph Neural Networks for Edge Prediction [51.06557652109059]
We propose a novel edge prediction paradigm named Edge-aware Message PassIng neuRal nEtworks (EMPIRE).
We first introduce an edge splitting technique that specifies the use of each edge, so that every edge serves solely as either topology or supervision.
To emphasize the differences between node pairs connected by supervision edges and unconnected pairs, we further weight the messages to highlight those that reflect these differences.
arXiv Detail & Related papers (2022-12-25T23:19:56Z)
- Label-Enhanced Graph Neural Network for Semi-supervised Node Classification [32.64730237473914]
We present a label-enhanced learning framework for Graph Neural Networks (GNNs).
It first models each label as a virtual center for intra-class nodes and then jointly learns the representations of both nodes and labels.
Our approach not only smooths the representations of nodes belonging to the same class, but also explicitly encodes the label semantics into the learning process of GNNs.
arXiv Detail & Related papers (2022-05-31T09:48:47Z)
- A Variational Edge Partition Model for Supervised Graph Representation Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into communities, the community-specific GNNs themselves, and a GNN-based predictor that combines them for the end classification task.
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- Structure-Aware Label Smoothing for Graph Neural Networks [39.97741949184259]
Representing a label distribution as a one-hot vector is a common practice in training node classification models.
We propose a novel SALS (Structure-Aware Label Smoothing) method as an enhancement component to popular node classification models (a generic label-smoothing sketch appears after this list).
arXiv Detail & Related papers (2021-12-01T13:48:58Z)
- Dynamic Labeling for Unlabeled Graph Neural Networks [34.65037955481084]
Graph neural networks (GNNs) rely on node embeddings to represent a node as a vector by its identity, type, or content.
Existing GNNs either assign random labels to nodes or assign one embedding to all nodes, which fails to distinguish one node from another.
In this paper, we analyze the limitation of existing approaches in two types of classification tasks, graph classification and node classification.
arXiv Detail & Related papers (2021-02-23T04:30:35Z)
- Label-Consistency based Graph Neural Networks for Semi-supervised Node Classification [47.753422069515366]
Graph neural networks (GNNs) achieve remarkable success in graph-based semi-supervised node classification.
In this paper, we propose the label-consistency based graph neural network (LC-GNN), which leverages unconnected node pairs that share the same label to enlarge the receptive field of nodes in GNNs.
Experiments on benchmark datasets demonstrate that the proposed LC-GNN outperforms traditional GNNs in graph-based semi-supervised node classification.
arXiv Detail & Related papers (2020-07-27T11:17:46Z)
- Unifying Graph Convolutional Neural Networks and Label Propagation [73.82013612939507]
We study the relationship between LPA and GCN in terms of two aspects: feature/label smoothing and feature/label influence.
Based on our theoretical analysis, we propose an end-to-end model that unifies GCN and LPA for node classification.
Our model can also be seen as learning attention weights based on node labels, which is more task-oriented than existing feature-based attention models (a minimal generic label-propagation sketch appears after this list).
arXiv Detail & Related papers (2020-02-17T03:23:13Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
To learn the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
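For the Structure-Aware Label Smoothing entry above, the following is a generic sketch of graph-aware label smoothing, not the SALS method itself: each labeled node's one-hot target is mixed with the empirical label distribution of its labeled neighbors, with the mixing weight `alpha` and the toy graph chosen purely for illustration.

```python
import numpy as np

def smooth_labels(A, y, num_classes, alpha=0.1):
    """Mix each labeled node's one-hot target with the label distribution
    of its labeled neighbors. A generic illustration, not the SALS method."""
    n = A.shape[0]
    targets = np.zeros((n, num_classes))
    onehot = np.eye(num_classes)
    for i in range(n):
        if y[i] < 0:                      # skip unlabeled nodes
            continue
        nbrs = [j for j in np.nonzero(A[i])[0] if y[j] >= 0]
        if nbrs:
            nbr_dist = onehot[[y[j] for j in nbrs]].mean(axis=0)
        else:
            nbr_dist = np.full(num_classes, 1.0 / num_classes)
        targets[i] = (1 - alpha) * onehot[y[i]] + alpha * nbr_dist
    return targets

# Tiny example: a path graph 0-1-2 with labels 0, 0, 1.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
print(smooth_labels(A, np.array([0, 0, 1]), num_classes=2))
```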
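For the entry on unifying GCNs and label propagation, here is a minimal, generic label-propagation routine (iterative neighborhood averaging with the known labels clamped after each step); it illustrates only the LPA component and is not the unified GCN-LPA model proposed in that paper.

```python
import numpy as np

def label_propagation(A, y, num_classes, num_iters=50):
    """Iteratively average neighbors' label distributions, clamping the
    known labels each round. Generic LPA, not the unified GCN-LPA model."""
    n = A.shape[0]
    P = A / A.sum(axis=1, keepdims=True)           # row-normalized adjacency
    F = np.full((n, num_classes), 1.0 / num_classes)
    labeled = y >= 0
    F[labeled] = np.eye(num_classes)[y[labeled]]
    for _ in range(num_iters):
        F = P @ F                                  # propagate to neighbors
        F[labeled] = np.eye(num_classes)[y[labeled]]  # clamp known labels
    return F.argmax(axis=1)

# Two triangles joined by a single edge; one labeled node in each triangle.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
y = np.array([0, -1, -1, -1, -1, 1])
print(label_propagation(A, y, num_classes=2))
```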