Hyperbolic Graph Neural Networks at Scale: A Meta Learning Approach
- URL: http://arxiv.org/abs/2310.18918v1
- Date: Sun, 29 Oct 2023 06:11:49 GMT
- Title: Hyperbolic Graph Neural Networks at Scale: A Meta Learning Approach
- Authors: Nurendra Choudhary and Nikhil Rao and Chandan K. Reddy
- Abstract summary: We introduce a novel method, Hyperbolic GRAph Meta Learner (H-GRAM), for the tasks of node classification and link prediction.
H-GRAM learns transferable information from a set of support local subgraphs in the form of hyperbolic meta gradients and label hyperbolic protonets.
Our comparative analysis shows that H-GRAM effectively learns and transfers information in multiple challenging few-shot settings.
- Score: 19.237565246362134
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The progress in hyperbolic neural networks (HNNs) research is hindered by
their absence of inductive bias mechanisms, which are essential for
generalizing to new tasks and facilitating scalable learning over large
datasets. In this paper, we aim to alleviate these issues by learning
generalizable inductive biases from the nodes' local subgraph and transfer them
for faster learning over new subgraphs with a disjoint set of nodes, edges, and
labels in a few-shot setting. We introduce a novel method, Hyperbolic GRAph
Meta Learner (H-GRAM), that, for the tasks of node classification and link
prediction, learns transferable information from a set of support local
subgraphs in the form of hyperbolic meta gradients and label hyperbolic
protonets to enable faster learning over a query set of new tasks dealing with
disjoint subgraphs. Furthermore, we show that an extension of our meta-learning
framework also mitigates the scalability challenges seen in HNNs faced by
existing approaches. Our comparative analysis shows that H-GRAM effectively
learns and transfers information in multiple challenging few-shot settings
compared to other state-of-the-art baselines. Additionally, we demonstrate
that, unlike standard HNNs, our approach is able to scale over large graph
datasets and improve performance over its Euclidean counterparts.
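The "label hyperbolic protonets" in the abstract are prototypical-network-style class representatives in hyperbolic space. As a rough illustration only (not the authors' implementation), the sketch below classifies few-shot queries by nearest class prototype under the Poincaré-ball distance; the function names are hypothetical, and using a re-projected Euclidean mean as the prototype is a deliberate simplification of a true hyperbolic mean.

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the Poincare unit ball."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / (denom + eps))

def project_to_ball(x, max_norm=1.0 - 1e-5):
    """Clip a point so it lies strictly inside the unit ball."""
    norm = np.linalg.norm(x)
    return x * (max_norm / norm) if norm >= max_norm else x

def hyperbolic_prototypes(support, labels):
    """One prototype per class: Euclidean mean of the support embeddings,
    re-projected into the ball (a simplification of a hyperbolic mean)."""
    classes = np.unique(labels)
    return {c: project_to_ball(support[labels == c].mean(axis=0)) for c in classes}

def classify(query, prototypes):
    """Assign the query to the class whose prototype is nearest
    under the Poincare distance."""
    return min(prototypes, key=lambda c: poincare_distance(query, prototypes[c]))
```

A query embedding is then labeled by whichever class prototype it is closest to, which is what makes the prototypes transferable across episodes with disjoint label sets.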
Related papers
- LAMP: Learnable Meta-Path Guided Adversarial Contrastive Learning for Heterogeneous Graphs [22.322402072526927]
Heterogeneous Graph Contrastive Learning (HGCL) usually requires pre-defined meta-paths.
LAMP integrates various meta-path sub-graphs into a unified and stable structure.
LAMP significantly outperforms existing state-of-the-art unsupervised models in terms of accuracy and robustness.
arXiv Detail & Related papers (2024-09-10T08:27:39Z)
- Enhancing Graph Neural Networks with Limited Labeled Data by Actively Distilling Knowledge from Large Language Models [30.867447814409623]
Graph neural networks (GNNs) have great ability in node classification, a fundamental task on graphs.
We propose a novel approach that integrates Large Language Models (LLMs) and GNNs.
Our model improves node classification accuracy with considerably limited labeled data, surpassing state-of-the-art baselines by significant margins.
arXiv Detail & Related papers (2024-07-19T02:34:10Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- Prototype-Enhanced Hypergraph Learning for Heterogeneous Information Networks [22.564818600608838]
We introduce a novel prototype-enhanced hypergraph learning approach for node classification in Heterogeneous Information Networks.
Our method captures higher-order relationships among nodes and extracts semantic information without relying on metapaths.
arXiv Detail & Related papers (2023-09-22T09:51:15Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
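The agreement objective between the anchor graph and the learned graph can be illustrated with a minimal InfoNCE-style node contrastive loss: each node's embedding from the learned graph should match the same node's embedding from the anchor graph and mismatch every other node's. This is a generic sketch, not the paper's exact loss; the function name and temperature value are assumptions.

```python
import numpy as np

def node_contrastive_loss(z_anchor, z_learned, tau=0.5):
    """InfoNCE-style loss over node embeddings from two views of the graph.
    Row i of each matrix is node i's embedding; positives are matched rows."""
    # Cosine-normalize both sets of node embeddings.
    a = z_anchor / np.linalg.norm(z_anchor, axis=1, keepdims=True)
    b = z_learned / np.linalg.norm(z_learned, axis=1, keepdims=True)
    sim = b @ a.T / tau                       # (n, n) scaled similarity matrix
    sim -= sim.max(axis=1, keepdims=True)     # subtract row max for stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))        # positives sit on the diagonal
```

Minimizing this loss pulls each node's two views together while pushing apart embeddings of different nodes, which is what "maximizing agreement" amounts to in practice.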
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Meta Propagation Networks for Graph Few-shot Semi-supervised Learning [39.96930762034581]
We propose a novel network architecture equipped with a novel meta-learning algorithm to solve this problem.
In essence, our framework Meta-PN infers high-quality pseudo labels on unlabeled nodes via a meta-learned label propagation strategy.
Our approach offers easy and substantial performance gains compared to existing techniques on various benchmark datasets.
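The label-propagation step that Meta-PN meta-learns builds on a classic baseline: spread one-hot labels over a normalized adjacency while clamping the labeled nodes after each step. The sketch below shows only that non-meta baseline, with hypothetical function names, not the meta-learned strategy itself.

```python
import numpy as np

def propagate_labels(adj, labels, mask, num_steps=10):
    """Spread one-hot labels from labeled nodes (mask=True) over a
    row-normalized adjacency with self-loops, clamping labeled nodes back
    after every step. Returns a pseudo-label distribution per node."""
    n = adj.shape[0]
    # Row-normalize the adjacency (plus self-loops) into a transition matrix.
    a = adj + np.eye(n)
    p = a / a.sum(axis=1, keepdims=True)
    num_classes = int(labels.max()) + 1
    y = np.zeros((n, num_classes))
    y[mask] = np.eye(num_classes)[labels[mask]]
    seed = y.copy()
    for _ in range(num_steps):
        y = p @ y
        y[mask] = seed[mask]   # clamp the known labels
    return y
```

Unlabeled nodes end up with soft pseudo-labels (the argmax gives a hard label), which can then supervise a downstream GNN; Meta-PN's contribution is learning how this propagation should behave rather than fixing it by hand.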
arXiv Detail & Related papers (2021-12-18T00:11:56Z)
- Weakly-supervised Graph Meta-learning for Few-shot Node Classification [53.36828125138149]
We propose a new graph meta-learning framework -- Graph Hallucination Networks (Meta-GHN).
Based on a new robustness-enhanced episodic training, Meta-GHN is meta-learned to hallucinate clean node representations from weakly-labeled data.
Extensive experiments demonstrate the superiority of Meta-GHN over existing graph meta-learning studies.
arXiv Detail & Related papers (2021-06-12T22:22:10Z)
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
However, the majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta path-free message passing scheme that packs up heterogeneous node features with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z)
- Contrastive and Generative Graph Convolutional Networks for Graph-based Semi-Supervised Learning [64.98816284854067]
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph.
A novel GCN-based SSL algorithm is presented in this paper to enrich the supervision signals by utilizing both data similarities and graph structure.
arXiv Detail & Related papers (2020-09-15T13:59:28Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.