AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks
- URL: http://arxiv.org/abs/2006.08198v2
- Date: Mon, 6 Jul 2020 15:41:44 GMT
- Title: AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks
- Authors: Yonggan Fu, Wuyang Chen, Haotao Wang, Haoran Li, Yingyan Lin,
Zhangyang Wang
- Abstract summary: Existing GAN compression algorithms are limited to handling specific GAN architectures and losses.
Inspired by the recent success of AutoML in deep compression, we introduce AutoML to GAN compression and develop an AutoGAN-Distiller (AGD) framework.
We evaluate AGD in two representative GAN tasks: image translation and super resolution.
- Score: 98.71508718214935
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The compression of Generative Adversarial Networks (GANs) has
recently drawn attention due to the increasing demand for deploying GANs on
mobile devices for numerous applications such as image translation,
enhancement, and editing. However, compared to the substantial effort devoted
to compressing other deep models, research on compressing GANs (usually their
generators) remains in its infancy. Existing GAN compression algorithms are
limited to handling
specific GAN architectures and losses. Inspired by the recent success of AutoML
in deep compression, we introduce AutoML to GAN compression and develop an
AutoGAN-Distiller (AGD) framework. Starting with a specifically designed
efficient search space, AGD performs an end-to-end discovery for new efficient
generators, given the target computational resource constraints. The search is
guided by the original GAN model via knowledge distillation, thereby
fulfilling the compression. AGD is fully automatic, standalone (i.e., needing
no trained discriminators), and generically applicable to various GAN models.
We evaluate AGD in two representative GAN tasks: image translation and super
resolution. Without bells and whistles, AGD yields remarkably lightweight yet
competitive compressed models that largely outperform existing alternatives.
Our code and pretrained models are available at
https://github.com/TAMU-VITA/AGD.
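
The abstract's core mechanism is a search over compact generator architectures under a computational budget, supervised by the frozen pretrained generator through knowledge distillation rather than by a trained discriminator. The sketch below is only a minimal illustration of such a distillation-guided objective under an assumed FLOPs budget; the function name, the pixel-level MSE distillation term, and the penalty weighting (student_flops, flops_budget, lam) are illustrative assumptions, not AGD's actual formulation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def distillation_search_loss(student: nn.Module,
                                 teacher: nn.Module,
                                 x: torch.Tensor,
                                 student_flops: float,
                                 flops_budget: float,
                                 lam: float = 1e-2) -> torch.Tensor:
        """Hypothetical search objective: match the candidate (student) generator's
        output to the frozen pretrained (teacher) generator's output, and softly
        penalize candidates whose estimated FLOPs exceed the target budget."""
        with torch.no_grad():
            y_teacher = teacher(x)                  # teacher output; no gradients flow back
        y_student = student(x)
        distill = F.mse_loss(y_student, y_teacher)  # distillation term (pixel-level here)
        over_budget = max(0.0, student_flops / flops_budget - 1.0)  # hinge on budget violation
        return distill + lam * over_budget

    # Toy usage with stand-in generators (the real search is over a designed
    # space of efficient generator architectures, not fixed modules like these).
    teacher = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
                            nn.Conv2d(64, 3, 3, padding=1))
    student = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                            nn.Conv2d(8, 3, 3, padding=1))
    x = torch.randn(1, 3, 64, 64)
    loss = distillation_search_loss(student, teacher, x,
                                    student_flops=1.2e8, flops_budget=1.0e8)

The actual distillation distance and budget handling in the paper may differ (perceptual-style losses are common for image translation and super resolution); the point of the sketch is only how a pretrained generator can replace a trained discriminator as the supervision signal during the search.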