Revisiting Discriminator in GAN Compression: A Generator-discriminator
Cooperative Compression Scheme
- URL: http://arxiv.org/abs/2110.14439v1
- Date: Wed, 27 Oct 2021 13:54:55 GMT
- Title: Revisiting Discriminator in GAN Compression: A Generator-discriminator
Cooperative Compression Scheme
- Authors: ShaoJie Li, Jie Wu, Xuefeng Xiao, Fei Chao, Xudong Mao, Rongrong Ji
- Abstract summary: GAN compression aims to reduce the tremendous computational overhead and memory usage incurred when deploying GANs on resource-constrained edge devices.
In this work, we revisit the role of the discriminator in GAN compression and design a novel generator-discriminator cooperative compression scheme for GAN compression, termed GCC.
- Score: 65.5405625485559
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, a series of algorithms have been explored for GAN compression,
which aims to reduce the tremendous computational overhead and memory usage
incurred when deploying GANs on resource-constrained edge devices. However, most
existing GAN compression work focuses only on how to compress the generator and
fails to take the discriminator into account. In this work, we revisit the role
of the discriminator in GAN compression and design a novel
generator-discriminator cooperative compression scheme for GAN compression,
termed GCC. Within GCC, a selective activation discriminator automatically
selects and activates convolutional channels according to a local capacity
constraint and a global coordination constraint, helping it maintain a Nash
equilibrium with the lightweight generator during adversarial training and
avoid mode collapse. The original generator and discriminator are also
trained from scratch to serve as teacher models that progressively refine the
pruned generator and the selective activation discriminator. A novel online
collaborative distillation scheme is designed to take full advantage of the
intermediate features of the teacher generator and discriminator to further
boost the performance of the lightweight generator. Extensive experiments on
various GAN-based generation tasks demonstrate the effectiveness and
generalization of GCC. Among other results, GCC reduces computational costs by
80% while maintaining comparable performance on image translation tasks. Our
code and models are available at \url{https://github.com/SJLeo/GCC}.
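To make the scheme concrete, below is a minimal sketch of how a selective activation discriminator could gate its convolutional channels under the two constraints described above. This is an illustrative reading of the abstract, not the authors' released code (see the repository above); names such as `SelectiveActivationConv` and `capacity_losses` are hypothetical.

```python
# Hedged sketch of a GCC-style selective activation discriminator layer.
# Names and budgets are hypothetical, not the authors' release
# (https://github.com/SJLeo/GCC).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveActivationConv(nn.Module):
    """Conv layer whose output channels are gated by learnable scores."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1)
        self.gate_logits = nn.Parameter(torch.zeros(out_ch))  # one score per channel

    def forward(self, x):
        gate = torch.sigmoid(self.gate_logits)  # soft per-channel on/off switch
        return F.leaky_relu(self.conv(x)) * gate.view(1, -1, 1, 1)

    def active_ratio(self):
        return torch.sigmoid(self.gate_logits).mean()

def capacity_losses(layers, local_budget, global_budget):
    # Local constraint: each layer stays near its own active-channel budget.
    local = sum((l.active_ratio() - local_budget) ** 2 for l in layers)
    # Global constraint: overall discriminator capacity tracks the pruned
    # generator's capacity, keeping the adversarial game balanced.
    overall = torch.stack([l.active_ratio() for l in layers]).mean()
    return local, (overall - global_budget) ** 2
```

The online collaborative distillation term could be approximated analogously, e.g. as an L2 loss between teacher and student intermediate feature maps, added to the adversarial loss alongside the two gate penalties above.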
Related papers
- Communication-Efficient Distributed Learning with Local Immediate Error
Compensation [95.6828475028581]
We propose the Local Immediate Error Compensated SGD (LIEC-SGD) optimization algorithm.
LIEC-SGD is superior to previous works in either convergence rate or communication cost.
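The abstract does not spell out the update rule; one plausible reading of "local immediate error compensation" is that each worker compresses its gradient before communication and folds the compression residual back into its own next update immediately, rather than after a server round trip. A toy sketch with top-k sparsification (all names hypothetical):

```python
import torch

def topk_compress(grad, k):
    """Keep the k largest-magnitude entries; return compressed grad and residual."""
    flat = grad.flatten()
    idx = flat.abs().topk(k).indices
    compressed = torch.zeros_like(flat)
    compressed[idx] = flat[idx]
    compressed = compressed.view_as(grad)
    return compressed, grad - compressed

class LocalErrorCompensatedWorker:
    """Toy worker: compensates compression error locally before the next send."""
    def __init__(self, param):
        self.residual = torch.zeros_like(param)

    def step_message(self, grad, k):
        corrected = grad + self.residual           # fold in last round's error now
        msg, self.residual = topk_compress(corrected, k)
        return msg                                 # only the sparse part is communicated
```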
arXiv Detail & Related papers (2024-02-19T05:59:09Z)
- Discriminator-Cooperated Feature Map Distillation for GAN Compression [69.86835014810714]
We present a discriminator-cooperated distillation scheme, abbreviated as DCD, that refines the feature maps produced by the generator.
Our DCD shows superior results compared with existing GAN compression methods.
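Reading "discriminator-cooperated" literally, one way to realize this is to compare teacher and student generator outputs inside the discriminator's feature space rather than in pixel space. A hedged sketch, assuming the discriminator exposes its intermediate feature maps as a list (not necessarily the paper's exact objective):

```python
import torch.nn.functional as F

def dcd_style_loss(disc_feats, fake_student, fake_teacher):
    """Distill in the discriminator's feature space. disc_feats(x) is assumed
    to return a list of intermediate discriminator feature maps for input x."""
    feats_s = disc_feats(fake_student)
    feats_t = disc_feats(fake_teacher)
    return sum(F.mse_loss(s, t.detach()) for s, t in zip(feats_s, feats_t))
```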
arXiv Detail & Related papers (2022-12-29T03:50:27Z)
- Generative Cooperative Networks for Natural Language Generation [25.090455367573988]
We introduce Generative Cooperative Networks, in which the discriminator architecture is cooperatively used along with the generation policy to output samples of realistic texts.
We give theoretical guarantees of convergence for our approach, and study various efficient decoding schemes to empirically achieve state-of-the-art results in two main NLG tasks.
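As one illustration of cooperative decoding, the sketch below samples several candidate texts from the generation policy and keeps the one the discriminator scores as most realistic; the paper studies several such schemes, and `lm_sample`/`disc_score` are placeholder callables, not names from the paper:

```python
import torch

def cooperative_decode(lm_sample, disc_score, n_candidates=16):
    """Sample candidates from the generation policy and keep the one the
    discriminator finds most realistic (one of several possible schemes)."""
    candidates = [lm_sample() for _ in range(n_candidates)]
    scores = torch.tensor([disc_score(c) for c in candidates])
    return candidates[scores.argmax().item()]
```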
arXiv Detail & Related papers (2022-01-28T18:36:57Z)
- cGANs with Auxiliary Discriminative Classifier [43.78253518292111]
Conditional generative models aim to learn the underlying joint distribution of data and labels.
Auxiliary classifier generative adversarial networks (AC-GAN) have been widely used, but suffer from low intra-class diversity in generated samples.
We propose novel cGANs with an auxiliary discriminative classifier (ADC-GAN) to address this issue.
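One common way to make the auxiliary classifier discriminative is to have it classify over (class, real/fake) pairs, i.e. a 2K-way problem for K classes, so it cannot ignore whether a sample is generated. A hedged sketch of such an objective (not necessarily the paper's exact formulation):

```python
import torch
import torch.nn.functional as F

def adc_style_loss(logits, labels, is_real, num_classes):
    """2K-way classifier over (class, real/fake): real class c maps to
    target c, fake class c maps to target c + K. logits: [B, 2K],
    labels: [B] long, is_real: [B] bool."""
    target = labels + (~is_real).long() * num_classes
    return F.cross_entropy(logits, target)
```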
arXiv Detail & Related papers (2021-07-21T13:06:32Z)
- Cycle-free CycleGAN using Invertible Generator for Unsupervised Low-Dose CT Denoising [33.79188588182528]
CycleGAN provides high-performance, ultra-fast denoising for low-dose X-ray computed tomography (CT) images.
However, CycleGAN requires two generators and two discriminators to enforce cycle consistency.
We present a novel cycle-free CycleGAN architecture, which consists of a single generator and a single discriminator but still guarantees cycle consistency.
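The key idea is that if the generator is built from exactly invertible blocks, the backward mapping is the analytic inverse, so cycle consistency holds by construction and the second generator/discriminator pair becomes unnecessary. A minimal additive-coupling sketch (a generic invertible block, not the paper's architecture):

```python
import torch
import torch.nn as nn

class AdditiveCoupling(nn.Module):
    """Invertible block: y1 = x1, y2 = x2 + f(x1). Because the inverse is
    exact, inverse(forward(x)) == x and no cycle loss is needed.
    Assumes an even channel count."""
    def __init__(self, ch):
        super().__init__()
        self.f = nn.Sequential(nn.Conv2d(ch // 2, ch // 2, 3, padding=1),
                               nn.ReLU(),
                               nn.Conv2d(ch // 2, ch // 2, 3, padding=1))

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=1)
        return torch.cat([x1, x2 + self.f(x1)], dim=1)

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=1)
        return torch.cat([y1, y2 - self.f(y1)], dim=1)
```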
arXiv Detail & Related papers (2021-04-17T13:23:36Z)
- Learning Efficient GANs for Image Translation via Differentiable Masks and co-Attention Distillation [130.30465659190773]
Generative Adversarial Networks (GANs) have been widely used in image translation, but their high computation and storage costs impede deployment on mobile devices.
We introduce a novel GAN compression method, termed DMAD, by proposing a Differentiable Mask and a co-Attention Distillation.
Experiments show DMAD can reduce the Multiply Accumulate Operations (MACs) of CycleGAN by 13x and that of Pix2Pix by 4x while retaining a comparable performance against the full model.
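A differentiable mask can be sketched as a learnable per-channel scale driven toward zero by an L1 penalty; channels whose scale ends up near zero are pruned after training. The sketch below is a generic version of this idea, not the released DMAD code:

```python
import torch
import torch.nn as nn

class DifferentiableMask(nn.Module):
    """Learnable per-channel mask; an L1 penalty on the scales pushes
    unneeded channels toward zero so they can be pruned afterwards."""
    def __init__(self, num_channels):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(num_channels))

    def forward(self, x):
        return x * self.scale.view(1, -1, 1, 1)

    def sparsity_penalty(self):
        return self.scale.abs().sum()
```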
arXiv Detail & Related papers (2020-11-17T02:39:19Z)
- Mode Penalty Generative Adversarial Network with adapted Auto-encoder [0.15229257192293197]
We propose a mode penalty GAN combined with a pre-trained auto-encoder for explicit representation of generated and real data samples in the encoded space.
Through experimental evaluations, we demonstrate that applying the proposed method to GANs makes the generator's optimization more stable and convergence faster.
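One plausible reading of the mode penalty is a repulsive term in the frozen auto-encoder's latent space, penalizing generated batches whose codes cluster together. A toy sketch; the paper's exact loss may differ:

```python
import torch

def mode_penalty(encoder, fake):
    """Push encoded fake samples apart in the latent space of a pre-trained
    (frozen) auto-encoder; the penalty grows when codes collapse together."""
    z = encoder(fake).flatten(1)                       # [B, d] latent codes
    d = torch.cdist(z, z)                              # pairwise distances
    d = d + torch.eye(len(z), device=z.device) * 1e9   # mask the diagonal
    return (1.0 / (d + 1e-6)).mean()                   # large when codes cluster
```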
arXiv Detail & Related papers (2020-11-16T03:39:53Z)
- GAN Slimming: All-in-One GAN Compression by A Unified Optimization Framework [94.26938614206689]
We propose the first unified optimization framework combining multiple compression means for GAN compression, dubbed GAN Slimming (GS).
We apply GS to compress CartoonGAN, a state-of-the-art style transfer network, by up to 47 times, with minimal visual quality degradation.
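Since GS unifies several compression means in one objective, a minimal sketch is a single loss combining adversarial training, distillation from the uncompressed teacher, and a channel-sparsity regularizer, with quantization handled by a straight-through fake quantizer during training. Weights and names below are hypothetical:

```python
import torch

def fake_quantize(w, bits=8):
    """Straight-through uniform quantizer for quantization-aware training."""
    scale = w.detach().abs().max() / (2 ** (bits - 1) - 1)
    q = torch.round(w / scale).clamp(-(2 ** (bits - 1)), 2 ** (bits - 1) - 1) * scale
    return w + (q - w).detach()   # quantized forward, identity gradient

def gs_style_objective(adv_loss, distill_loss, channel_l1, w_d=1.0, w_s=0.01):
    # One loss, three compression means: adversarial + distillation +
    # channel sparsity (quantization enters via fake_quantize on weights).
    return adv_loss + w_d * distill_loss + w_s * channel_l1
```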
arXiv Detail & Related papers (2020-08-25T14:39:42Z)
- AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks [98.71508718214935]
Existing GAN compression algorithms are limited to handling specific GAN architectures and losses.
Inspired by the recent success of AutoML in deep compression, we introduce AutoML to GAN compression and develop an AutoGAN-Distiller (AGD) framework.
We evaluate AGD in two representative GAN tasks: image translation and super resolution.
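The AutoML component can be illustrated with a one-shot, DARTS-style search in which each layer mixes candidate operations of different widths under softmaxed architecture weights, keeping the argmax operation after search. This is a generic sketch of that family of methods, not the released AGD search space:

```python
import torch
import torch.nn as nn

class SearchableLayer(nn.Module):
    """Mixed layer: candidate ops of different widths, weighted by softmaxed
    architecture parameters alpha; after search, the argmax op is kept."""
    def __init__(self, ch, widths=(0.25, 0.5, 1.0)):
        super().__init__()
        self.ops = nn.ModuleList(
            nn.Sequential(nn.Conv2d(ch, int(ch * w), 3, padding=1),
                          nn.ReLU(),
                          nn.Conv2d(int(ch * w), ch, 3, padding=1))
            for w in widths)
        self.alpha = nn.Parameter(torch.zeros(len(widths)))

    def forward(self, x):
        weights = torch.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))
```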
arXiv Detail & Related papers (2020-06-15T07:56:24Z)