Content-Aware GAN Compression
- URL: http://arxiv.org/abs/2104.02244v1
- Date: Tue, 6 Apr 2021 02:23:56 GMT
- Title: Content-Aware GAN Compression
- Authors: Yuchen Liu, Zhixin Shu, Yijun Li, Zhe Lin, Federico Perazzi, S.Y. Kung
- Abstract summary: Generative adversarial networks (GANs) play a vital role in various image generation and synthesis tasks.
Their notoriously high computational cost hinders their efficient deployment on edge devices.
We propose novel approaches for unconditional GAN compression.
- Score: 33.83749494060526
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative adversarial networks (GANs), e.g., StyleGAN2, play a vital role in
various image generation and synthesis tasks, yet their notoriously high
computational cost hinders their efficient deployment on edge devices. Directly
applying generic compression approaches yields poor results on GANs, which
motivates a number of recent GAN compression works. While prior works mainly
accelerate conditional GANs, e.g., pix2pix and CycleGAN, compressing
state-of-the-art unconditional GANs has rarely been explored and is more
challenging. In this paper, we propose novel approaches for unconditional GAN
compression. We first introduce effective channel pruning and knowledge
distillation schemes specialized for unconditional GANs. We then propose a
novel content-aware method to guide the processes of both pruning and
distillation. With content-awareness, we can effectively prune channels that
are unimportant to the contents of interest, e.g., human faces, and focus our
distillation on these regions, which significantly enhances the distillation
quality. On StyleGAN2 and SN-GAN, we achieve a substantial improvement over the
state-of-the-art compression method. Notably, we reduce the FLOPs of StyleGAN2
by 11x with visually negligible image quality loss compared to the full-size
model. More interestingly, when applied to various image manipulation tasks,
our compressed model forms a smoother and better disentangled latent manifold,
making it more effective for image editing.
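Since the abstract only sketches how content-awareness steers pruning and distillation, here is a minimal PyTorch sketch of the general idea: score channels by how strongly they respond inside a content mask (e.g., a face region from an off-the-shelf parser), and up-weight the distillation loss in that region. The helper names (`channel_saliency`, `masked_distillation_loss`) and all specifics are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only -- hypothetical helpers, not the paper's code.
import torch
import torch.nn.functional as F

def channel_saliency(teacher_act, content_mask):
    """Score each channel of one generator layer by its activation
    magnitude inside the content region; low scorers are pruning candidates."""
    # teacher_act: (B, C, H, W); content_mask: (B, 1, Hm, Wm), values in [0, 1]
    mask = F.interpolate(content_mask, size=teacher_act.shape[-2:], mode="nearest")
    return (teacher_act.abs() * mask).mean(dim=(0, 2, 3))  # one score per channel

def masked_distillation_loss(student_img, teacher_img, content_mask):
    """Pixel-wise distillation that up-weights the content of interest."""
    weight = 1.0 + content_mask  # background weight 1, content weight 2 (assumed)
    return (weight * (student_img - teacher_img) ** 2).mean()

# Toy usage: rank channels, keep the top half, distill the pruned student.
acts = torch.randn(4, 512, 32, 32)                 # stand-in teacher activations
mask = (torch.rand(4, 1, 256, 256) > 0.5).float()  # stand-in face-region mask
scores = channel_saliency(acts, mask)
keep = scores.argsort(descending=True)[: scores.numel() // 2]

student_img = torch.rand(4, 3, 256, 256)
teacher_img = torch.rand(4, 3, 256, 256)
loss = masked_distillation_loss(student_img, teacher_img, mask)
print(f"keeping {keep.numel()}/{scores.numel()} channels, loss = {loss.item():.4f}")
```

One design choice in this reading is to weight rather than hard-mask the distillation target, so background regions are still loosely matched while the student's remaining capacity is concentrated on the faces; the paper's actual importance measure and masking scheme may differ.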
Related papers
- DGNet: Dynamic Gradient-Guided Network for Water-Related Optics Image Enhancement [77.0360085530701]
Underwater image enhancement (UIE) is a challenging task due to the complex degradation caused by underwater environments.
Previous methods often idealize the degradation process and neglect the impact of medium noise and object motion on the distribution of image features.
Our approach utilizes predicted images to dynamically update pseudo-labels, adding a dynamic gradient to optimize the network's gradient space.
arXiv Detail & Related papers (2023-12-12T06:07:21Z)
- Online Multi-Granularity Distillation for GAN Compression [17.114017187236836]
Generative Adversarial Networks (GANs) have witnessed prevailing success in yielding outstanding images.
GANs are burdensome to deploy on resource-constrained devices due to their heavy computational cost and memory usage.
We propose a novel online multi-granularity distillation scheme to obtain lightweight GANs.
arXiv Detail & Related papers (2021-08-16T05:49:50Z)
- GAN Slimming: All-in-One GAN Compression by A Unified Optimization Framework [94.26938614206689]
We propose the first unified optimization framework combining multiple compression means for GAN compression, dubbed GAN Slimming.
We apply GS to compress CartoonGAN, a state-of-the-art style transfer network, by up to 47 times, with minimal visual quality degradation.
arXiv Detail & Related papers (2020-08-25T14:39:42Z)
- Early Exit or Not: Resource-Efficient Blind Quality Enhancement for Compressed Images [54.40852143927333]
Lossy image compression is pervasively conducted to save communication bandwidth, resulting in undesirable compression artifacts.
We propose a resource-efficient blind quality enhancement (RBQE) approach for compressed images.
Our approach can automatically decide to terminate or continue enhancement according to the assessed quality of enhanced images.
arXiv Detail & Related papers (2020-06-30T07:38:47Z)
- AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks [98.71508718214935]
Existing GAN compression algorithms are limited to handling specific GAN architectures and losses.
Inspired by the recent success of AutoML in deep compression, we introduce AutoML to GAN compression and develop an AutoGAN-Distiller framework.
We evaluate AGD in two representative GAN tasks: image translation and super resolution.
arXiv Detail & Related papers (2020-06-15T07:56:24Z)
- GAN Compression: Efficient Architectures for Interactive Conditional GANs [45.012173624111185]
Recent Conditional Generative Adversarial Networks (cGANs) are 1-2 orders of magnitude more compute-intensive than modern recognition CNNs.
We propose a general-purpose compression framework for reducing the inference time and model size of the generator in cGANs.
arXiv Detail & Related papers (2020-03-19T17:59:05Z)
- Blur, Noise, and Compression Robust Generative Adversarial Networks [85.68632778835253]
We propose blur, noise, and compression robust GAN (BNCR-GAN) to learn a clean image generator directly from degraded images.
Inspired by NR-GAN, BNCR-GAN uses a multiple-generator model composed of image, blur-kernel, noise, and quality-factor generators.
We demonstrate the effectiveness of BNCR-GAN through large-scale comparative studies on CIFAR-10 and a generality analysis on FFHQ.
arXiv Detail & Related papers (2020-03-17T17:56:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.