Leveraging Topological Guidance for Improved Knowledge Distillation
- URL: http://arxiv.org/abs/2407.05316v1
- Date: Sun, 7 Jul 2024 10:09:18 GMT
- Title: Leveraging Topological Guidance for Improved Knowledge Distillation
- Authors: Eun Som Jeon, Rahul Khurana, Aishani Pathak, Pavan Turaga,
- Abstract summary: We propose a framework called Topological Guidance-based Knowledge Distillation (TGD) for image classification tasks.
We utilize KD to train a superior lightweight model and provide topological features with multiple teachers simultaneously.
We introduce a mechanism for integrating features from different teachers and reducing the knowledge gap between teachers and the student.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning has shown its efficacy in extracting useful features to solve various computer vision tasks. However, when the structure of the data is complex and noisy, capturing effective information to improve performance is very difficult. To this end, topological data analysis (TDA) has been utilized to derive useful representations that can contribute to improving performance and robustness against perturbations. Despite its effectiveness, the requirements for large computational resources and significant time consumption in extracting topological features through TDA are critical problems when implementing it on small devices. To address this issue, we propose a framework called Topological Guidance-based Knowledge Distillation (TGD), which uses topological features in knowledge distillation (KD) for image classification tasks. We utilize KD to train a superior lightweight model and provide topological features with multiple teachers simultaneously. We introduce a mechanism for integrating features from different teachers and reducing the knowledge gap between teachers and the student, which aids in improving performance. We demonstrate the effectiveness of our approach through diverse empirical evaluations.
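The abstract describes training a lightweight student against multiple teachers that supply topological features simultaneously, with a mechanism for integrating the teachers' knowledge. The paper's exact integration mechanism is not specified here, so the sketch below only illustrates the generic multi-teacher distillation setup it builds on: each teacher's temperature-softened output is matched by the student via a KL-divergence term, and the per-teacher terms are combined with a plain convex weighting (the weights, temperature, and helper names are illustrative assumptions, not the paper's method).

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-softened softmax; higher T yields softer targets.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kl_div(p, q, eps=1e-12):
    # KL(p || q) for two discrete distributions.
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def multi_teacher_kd_loss(student_logits, teacher_logits_list, weights, T=4.0):
    """Convex combination of per-teacher KL terms at temperature T.

    Illustrative only: TGD's actual feature-integration mechanism
    for closing the teacher-student gap may differ from this simple
    weighted average of soft-label losses.
    """
    s = softmax(student_logits, T)
    loss = 0.0
    for w, t_logits in zip(weights, teacher_logits_list):
        loss += w * kl_div(softmax(t_logits, T), s)
    # The usual T^2 factor keeps gradient magnitudes comparable
    # across temperatures (Hinton et al.'s convention).
    return (T ** 2) * loss
```

In practice this distillation term would be added to the standard cross-entropy loss on ground-truth labels; here one teacher could be a TDA-feature model and the other a conventional image model, per the paper's multi-teacher setup.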