Few-Shot Learning on Graphs via Super-Classes based on Graph Spectral Measures
- URL: http://arxiv.org/abs/2002.12815v1
- Date: Thu, 27 Feb 2020 17:11:14 GMT
- Title: Few-Shot Learning on Graphs via Super-Classes based on Graph Spectral Measures
- Authors: Jatin Chauhan, Deepak Nathani, Manohar Kaul
- Abstract summary: We study the problem of few-shot graph classification in graph neural networks (GNNs) to recognize unseen classes, given limited labeled graph examples.
We propose an approach where a probability measure is assigned to each graph based on the spectrum of the graph's normalized Laplacian.
- Score: 14.932318540666545
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose to study the problem of few-shot graph classification in
graph neural networks (GNNs) to recognize unseen classes, given limited labeled
graph examples. Despite several interesting GNN variants being proposed
recently for node and graph classification tasks, these GNNs exhibit a
significant loss in classification performance when faced with scarce labeled
examples in the few-shot setting. Here, we present an approach where a
probability measure is assigned to each graph based on the spectrum of the
graph's normalized Laplacian. This enables us to cluster the base labels
associated with each graph into super-classes, with the $L^p$ Wasserstein
distance serving as the underlying distance metric. A super-graph constructed
from these super-classes is then fed to our proposed GNN framework, which
exploits the latent inter-class relationships made explicit by the super-graph
to achieve better class-label separation among the graphs. We conduct
exhaustive empirical evaluations of our proposed method and show that it
outperforms both adaptations of state-of-the-art graph classification methods
to the few-shot scenario and our naive baseline GNNs. Additionally, we extend
our method to semi-supervised and active learning scenarios and study its
behavior there.
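The pipeline described above (a spectral measure per graph, $L^p$ Wasserstein
distances between measures, clustering of base labels into super-classes) can
be illustrated with a short sketch. The code below is not the authors'
implementation: the helper names (spectral_measure, wasserstein_p,
cluster_super_classes), the pooling of per-graph eigenvalues into a single
class-level measure, and the use of average-linkage hierarchical clustering are
assumptions made for illustration only.

```python
# Minimal sketch (assumed structure, not the paper's code): spectral measures
# from the normalized Laplacian, 1-D p-Wasserstein distances, super-class clustering.
import numpy as np
import networkx as nx
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform


def spectral_measure(G: nx.Graph) -> np.ndarray:
    """Eigenvalues of the normalized Laplacian, treated as a 1-D empirical measure."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    return np.sort(np.linalg.eigvalsh(L))


def wasserstein_p(a: np.ndarray, b: np.ndarray, p: int = 2, grid: int = 100) -> float:
    """p-Wasserstein distance between two 1-D empirical measures via quantile functions."""
    qs = np.linspace(0.0, 1.0, grid)
    qa, qb = np.quantile(a, qs), np.quantile(b, qs)
    return float(np.mean(np.abs(qa - qb) ** p) ** (1.0 / p))


def cluster_super_classes(class_to_graphs: dict, n_super: int, p: int = 2) -> dict:
    """Group base classes into `n_super` super-classes by pairwise spectral distance."""
    labels = list(class_to_graphs)
    # One representative measure per base class: pool the eigenvalues of its graphs
    # (an illustrative choice; class prototypes could be formed differently).
    measures = {c: np.concatenate([spectral_measure(g) for g in class_to_graphs[c]])
                for c in labels}
    n = len(labels)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = wasserstein_p(measures[labels[i]], measures[labels[j]], p)
    # Hierarchical clustering on the distance matrix yields the super-class assignment.
    Z = linkage(squareform(D), method="average")
    assignment = fcluster(Z, t=n_super, criterion="maxclust")
    return {c: int(s) for c, s in zip(labels, assignment)}
```

On the real line the p-Wasserstein distance reduces to a comparison of quantile
functions, which is what makes sorted Laplacian spectra (all eigenvalues lie in
[0, 2]) convenient to compare; the resulting super-class assignment is what a
super-graph would then be built from.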
Related papers
- A Class-Aware Representation Refinement Framework for Graph Classification [8.998543739618077]
We propose a Class-Aware Representation rEfinement (CARE) framework for the task of graph classification.
CARE computes simple yet powerful class representations and injects them to steer the learning of graph representations towards better class separability.
Our experiments with 11 well-known GNN backbones on 9 benchmark datasets validate the superiority and effectiveness of CARE over its GNN counterparts.
arXiv Detail & Related papers (2022-09-02T10:18:33Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- Semi-Supervised Hierarchical Graph Classification [54.25165160435073]
We study the node classification problem in the hierarchical graph where a 'node' is a graph instance.
We propose Hierarchical Graph Mutual Information (HGMI) and present a way to compute it with a theoretical guarantee.
We demonstrate the effectiveness of this hierarchical graph modeling and the proposed SEAL-CI method on text and social network data.
arXiv Detail & Related papers (2022-06-11T04:05:29Z)
- Graph Summarization with Graph Neural Networks [2.449909275410288]
We use Graph Neural Networks to represent large graphs in a structured and compact way.
We compare different GNNs with a standard multi-layer perceptron (MLP) and a Bloom filter as non-neural baselines.
Our results show that the performances of the different GNNs are close to each other.
arXiv Detail & Related papers (2022-03-11T13:45:34Z)
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Neighborhood Random Walk Graph Sampling for Regularized Bayesian Graph Convolutional Neural Networks [0.6236890292833384]
In this paper, we propose a novel algorithm called Bayesian Graph Convolutional Network using Neighborhood Random Walk Sampling (BGCN-NRWS)
BGCN-NRWS uses a Markov Chain Monte Carlo (MCMC) based graph sampling algorithm that exploits graph structure, reduces overfitting through a variational inference layer, and yields consistently competitive classification results compared to the state of the art in semi-supervised node classification.
arXiv Detail & Related papers (2021-12-14T20:58:27Z)
- Imbalanced Graph Classification via Graph-of-Graph Neural Networks [16.589373163769853]
Graph Neural Networks (GNNs) have achieved unprecedented success in learning graph representations to identify categorical labels of graphs.
We introduce a novel framework, Graph-of-Graph Neural Networks (G$2$GNN), which alleviates the graph imbalance issue by deriving extra supervision globally from neighboring graphs and locally from graphs themselves.
Our proposed G$2$GNN outperforms numerous baselines by roughly 5% in both F1-macro and F1-micro scores.
arXiv Detail & Related papers (2021-12-01T02:25:47Z)
- Structure-Enhanced Meta-Learning For Few-Shot Graph Classification [53.54066611743269]
This work explores the potential of metric-based meta-learning for solving few-shot graph classification.
An implementation upon GIN, named SMFGIN, is tested on two datasets, Chembl and TRIANGLES.
arXiv Detail & Related papers (2021-03-05T09:03:03Z)
- Certified Robustness of Graph Classification against Topology Attack with Randomized Smoothing [22.16111584447466]
Graph-based machine learning models are vulnerable to adversarial perturbations due to the non-i.i.d. nature of graph data.
We build a smoothed graph classification model with a certified robustness guarantee.
We also evaluate the effectiveness of our approach under a graph convolutional network (GCN) based multi-class graph classification model; a minimal sketch of the smoothing idea appears after this list.
arXiv Detail & Related papers (2020-09-12T22:18:54Z)
- Inverse Graph Identification: Can We Identify Node Labels Given Graph Labels? [89.13567439679709]
Graph Identification (GI) has long been researched in graph learning and is essential in certain applications.
This paper defines a novel problem dubbed Inverse Graph Identification (IGI).
We propose a simple yet effective method that performs node-level message passing with a Graph Attention Network (GAT) under the protocol of GI.
arXiv Detail & Related papers (2020-07-12T12:06:17Z)
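For the certified-robustness entry above, the following is a generic sketch of
randomized smoothing for graph classification under topology perturbations, not
the cited paper's exact scheme: the function name smoothed_predict, the
edge-flip probability flip_prob, and the plain majority vote are illustrative
assumptions, and the certification step that bounds the vote probabilities is
omitted.

```python
# Hedged sketch: smooth a graph classifier by voting over random edge flips.
import numpy as np


def smoothed_predict(adj, base_classifier, num_classes,
                     flip_prob=0.05, num_samples=1000, seed=0):
    """Majority-vote prediction over copies of `adj` with randomly flipped edges.

    `base_classifier(adj) -> int` can be any graph classifier (e.g. a trained GCN);
    deriving a certified radius from the vote counts is omitted here.
    """
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    iu = np.triu_indices(n, k=1)                         # each undirected edge slot once
    votes = np.zeros(num_classes, dtype=int)
    for _ in range(num_samples):
        perturbed = adj.copy()
        upper = perturbed[iu]
        flips = rng.random(upper.shape[0]) < flip_prob   # Bernoulli topology noise
        upper[flips] = 1 - upper[flips]                  # flip edge <-> non-edge
        perturbed[iu] = upper
        perturbed[iu[1], iu[0]] = upper                  # keep the adjacency symmetric
        votes[base_classifier(perturbed)] += 1
    return int(np.argmax(votes))
```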