Knowledge Distillation on Graphs: A Survey
- URL: http://arxiv.org/abs/2302.00219v1
- Date: Wed, 1 Feb 2023 03:51:05 GMT
- Title: Knowledge Distillation on Graphs: A Survey
- Authors: Yijun Tian, Shichao Pei, Xiangliang Zhang, Chuxu Zhang, Nitesh V.
Chawla
- Abstract summary: Knowledge distillation on graphs (KDG) has been introduced to build smaller yet effective models and to exploit more knowledge from data.
We introduce the challenges and bases of KDG, then categorize and summarize existing KDG works.
- Score: 43.60992916277081
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have attracted tremendous attention by
demonstrating their capability to handle graph data. However, they are
difficult to deploy on resource-limited devices due to model sizes and
scalability constraints imposed by the multi-hop data dependency. In addition,
real-world graphs usually possess complex structural information and features.
Therefore, to improve the applicability of GNNs and fully encode the
complicated topological information, knowledge distillation on graphs (KDG) has
been introduced to build a smaller yet effective model and exploit more
knowledge from data, leading to model compression and performance improvement.
Recently, KDG has achieved considerable progress, with many studies proposed. In
this survey, we systematically review these works. Specifically, we first
introduce KDG challenges and bases, then categorize and summarize existing
works of KDG by answering the following three questions: 1) what to distill,
2) who to whom, and 3) how to distill. Finally, we share our thoughts on
future research directions.
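To ground the teacher-student idea in code, the snippet below is a minimal sketch of soft-label distillation for node classification: a frozen GNN teacher's temperature-softened logits supervise a lightweight student (possibly a feature-only MLP, which removes the multi-hop data dependency at inference time). All model and function names are illustrative, not taken from the survey.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Classic KD objective: softened KL term plus hard-label CE term.

    T (temperature) smooths both distributions; alpha balances the two terms.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients stay comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

def train_step(student, teacher, optimizer, x, edge_index, labels):
    """One distillation step; `teacher` is a frozen, pre-trained GNN and
    `student` can be a smaller GNN or a plain MLP over node features."""
    with torch.no_grad():
        teacher_logits = teacher(x, edge_index)  # multi-hop message passing
    student_logits = student(x)                  # feature-only forward pass
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```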
Related papers
- A Survey on Graph Condensation [14.94630644865636]
Graph condensation (GC) has emerged as a solution to address challenges arising from the escalating volume of graph data.
For a better understanding of GC and to distinguish it from other related topics, we present a formal definition of GC and establish a taxonomy.
We conclude by addressing challenges and limitations, outlining future directions, and offering concise guidelines to inspire future research in this field.
arXiv Detail & Related papers (2024-02-03T02:50:51Z) - Graph Domain Adaptation: Challenges, Progress and Prospects [61.9048172631524]
We propose graph domain adaptation as an effective knowledge-transfer paradigm across graphs.
GDA introduces a set of task-related graphs as source graphs and adapts the knowledge learned from the source graphs to the target graphs.
We outline the research status and challenges, propose a taxonomy, introduce the details of representative works, and discuss the prospects.
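One common recipe for such cross-graph transfer (a sketch of DANN-style domain-adversarial training, not necessarily the formulation surveyed here) learns domain-invariant node embeddings by reversing gradients from a source-vs-target domain classifier. `encoder`, `clf`, and `dom_clf` are hypothetical modules, and `src`/`tgt` are assumed to be PyG-style data objects:

```python
import torch
import torch.nn.functional as F

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; negated, scaled gradient on backward."""

    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None

def gda_loss(encoder, clf, dom_clf, src, tgt, src_labels, lamb=1.0):
    """Task loss on the labeled source graph plus a domain-adversarial term
    that pushes the encoder toward domain-invariant node embeddings."""
    h_src = encoder(src.x, src.edge_index)
    h_tgt = encoder(tgt.x, tgt.edge_index)
    task_loss = F.cross_entropy(clf(h_src), src_labels)
    # The domain classifier sees reversed gradients, so minimizing its loss
    # w.r.t. the encoder actually maximizes domain confusion.
    h_all = GradReverse.apply(torch.cat([h_src, h_tgt]), lamb)
    dom_labels = torch.cat(
        [torch.zeros(h_src.size(0)), torch.ones(h_tgt.size(0))]
    ).long()
    dom_loss = F.cross_entropy(dom_clf(h_all), dom_labels)
    return task_loss + dom_loss
```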
arXiv Detail & Related papers (2024-02-01T02:44:32Z) - Towards Data-centric Graph Machine Learning: Review and Outlook [120.64417630324378]
We introduce a systematic framework, Data-centric Graph Machine Learning (DC-GML), that encompasses all stages of the graph data lifecycle.
A thorough taxonomy of each stage is presented to answer three critical graph-centric questions.
We pinpoint the future prospects of the DC-GML domain, providing insights to navigate its advancements and applications.
arXiv Detail & Related papers (2023-09-20T00:40:13Z) - Graph-based Knowledge Distillation: A survey and experimental evaluation [4.713436329217004]
Knowledge Distillation (KD) has been introduced to enhance existing Graph Neural Networks (GNNs).
KD involves transferring the soft-label supervision of the large teacher model to the small student model while maintaining prediction performance.
This paper first introduces the background of graph and KD. It then provides a comprehensive summary of three types of Graph-based Knowledge Distillation methods.
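Beyond matching logits, graph-based KD methods often transfer structural knowledge as well. The sketch below is a simplified variant of local-structure preservation, matching teacher and student similarities across each edge's endpoints; the function and argument names are illustrative:

```python
import torch.nn.functional as F

def local_structure_loss(h_student, h_teacher, edge_index):
    """Align student and teacher similarities across each edge's endpoints.

    h_*: node embeddings of shape [num_nodes, dim];
    edge_index: [2, num_edges] tensor in COO format.
    """
    src, dst = edge_index
    sim_s = F.cosine_similarity(h_student[src], h_student[dst], dim=-1)
    sim_t = F.cosine_similarity(h_teacher[src], h_teacher[dst], dim=-1)
    return F.mse_loss(sim_s, sim_t)
```

Such a term is typically added to the soft-label loss, so the student mimics not only the teacher's predictions but also how the teacher embeds neighboring nodes relative to each other.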
arXiv Detail & Related papers (2023-02-27T11:39:23Z) - A Comprehensive Survey on Graph Summarization with Graph Neural Networks [21.337505372979066]
In the past, most graph summarization techniques sought to capture the most important part of a graph statistically.
Today, the high dimensionality and complexity of modern graph data are making deep learning techniques more popular.
Our investigation includes a review of the current state-of-the-art approaches, including recurrent GNNs, convolutional GNNs, graph autoencoders, and graph attention networks.
arXiv Detail & Related papers (2023-02-13T05:43:24Z) - A Survey of Knowledge Graph Reasoning on Graph Types: Static, Dynamic,
and Multimodal [57.8455911689554]
Knowledge graph reasoning (KGR) aims to deduce new facts from existing facts based on mined logic rules underlying knowledge graphs (KGs).
It has been proven to significantly benefit the usage of KGs in many AI applications, such as question answering and recommendation systems.
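As a toy illustration of rule-based KGR (the entities and the rule are invented for this example), applying one mined composition rule to a triple store derives new facts:

```python
# Toy knowledge graph as (head, relation, tail) triples.
triples = {
    ("alice", "born_in", "lyon"),
    ("lyon", "located_in", "france"),
    ("bob", "born_in", "kyoto"),
    ("kyoto", "located_in", "japan"),
}

def apply_composition_rule(triples, r1, r2, r_new):
    """If (x, r1, y) and (y, r2, z) hold, infer the new fact (x, r_new, z)."""
    by_head = {}
    for h, r, t in triples:
        by_head.setdefault((h, r), set()).add(t)
    inferred = set()
    for h, r, t in triples:
        if r == r1:
            for z in by_head.get((t, r2), ()):
                inferred.add((h, r_new, z))
    return inferred - triples

# Mined rule: born_in(x, y) AND located_in(y, z) => citizen_of(x, z)
print(apply_composition_rule(triples, "born_in", "located_in", "citizen_of"))
# yields ('alice', 'citizen_of', 'france') and ('bob', 'citizen_of', 'japan')
```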
arXiv Detail & Related papers (2022-12-12T08:40:04Z) - Data-Free Adversarial Knowledge Distillation for Graph Neural Networks [62.71646916191515]
We propose the first end-to-end framework for data-free adversarial knowledge distillation on graph-structured data (DFAD-GNN).
Specifically, our DFAD-GNN employs a generative adversarial network consisting of three components: a pre-trained teacher model and a student model serve as two discriminators, while a generator produces training graphs used to distill knowledge from the teacher into the student.
Our DFAD-GNN significantly surpasses state-of-the-art data-free baselines in the graph classification task.
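A heavily simplified sketch of this adversarial loop follows: the generator is trained to synthesize graphs on which the student and the frozen teacher disagree, and the student is then trained to close that gap. The generator's output format (dense adjacency plus node features) and all module names are assumptions for illustration, not the paper's exact design:

```python
import torch
import torch.nn.functional as F

def dfad_round(generator, teacher, student, g_opt, s_opt, z_dim=32, batch=16):
    """One alternating round of data-free adversarial distillation.

    `generator` maps noise to synthetic graphs (here: dense adjacency plus
    node features); `teacher` is frozen, `student` is being distilled.
    """
    # Generator step: synthesize graphs that maximize the teacher-student gap.
    z = torch.randn(batch, z_dim)
    adj, feat = generator(z)
    with torch.no_grad():
        t_logits = teacher(feat, adj)
    g_loss = -F.l1_loss(student(feat, adj), t_logits)  # ascend the gap
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

    # Student step: match the frozen teacher on freshly generated graphs.
    z = torch.randn(batch, z_dim)
    adj, feat = generator(z)
    adj, feat = adj.detach(), feat.detach()
    with torch.no_grad():
        t_logits = teacher(feat, adj)
    s_loss = F.l1_loss(student(feat, adj), t_logits)
    s_opt.zero_grad()
    s_loss.backward()
    s_opt.step()
    return g_loss.item(), s_loss.item()
```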
arXiv Detail & Related papers (2022-05-08T08:19:40Z) - Data Augmentation for Deep Graph Learning: A Survey [66.04015540536027]
We first propose a taxonomy for graph data augmentation and then provide a structured review by categorizing the related work based on the augmented information modalities.
Focusing on the two challenging problems in DGL (i.e., optimal graph learning and low-resource graph learning), we also discuss and review the existing learning paradigms which are based on graph data augmentation.
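As one concrete instance of structure-level augmentation from such a taxonomy (the helper below is an illustrative sketch in PyG-style COO layout), random edge dropping perturbs the topology while leaving node features untouched:

```python
import torch

def drop_edges(edge_index, drop_prob=0.2):
    """Randomly remove a fraction of edges (structure-level augmentation).

    edge_index: [2, num_edges] tensor in COO format (PyG-style layout).
    """
    num_edges = edge_index.size(1)
    keep = torch.rand(num_edges) >= drop_prob
    return edge_index[:, keep]
```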
arXiv Detail & Related papers (2022-02-16T18:30:33Z)