Flexible Neural Image Compression via Code Editing
- URL: http://arxiv.org/abs/2209.09244v1
- Date: Mon, 19 Sep 2022 09:41:43 GMT
- Title: Flexible Neural Image Compression via Code Editing
- Authors: Chenjian Gao, Tongda Xu, Dailan He, Hongwei Qin, Yan Wang
- Abstract summary: Neural image compression (NIC) has outperformed traditional image codecs in rate-distortion (R-D) performance.
It usually requires a dedicated encoder-decoder pair for each point on the R-D curve, which greatly hinders its practical deployment.
We propose Code Editing, a highly flexible coding method for NIC based on semi-amortized inference and adaptive quantization.
- Score: 8.499248314440557
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural image compression (NIC) has outperformed traditional image codecs in
rate-distortion (R-D) performance. However, it usually requires a dedicated
encoder-decoder pair for each point on the R-D curve, which greatly hinders its
practical deployment. While some recent works have enabled bitrate control via
conditional coding, they impose a strong prior during training and provide
limited flexibility. In this paper we propose Code Editing, a highly flexible
coding method for NIC based on semi-amortized inference and adaptive
quantization. Our work is a new paradigm for variable bitrate NIC. Furthermore,
experimental results show that our method surpasses existing variable-rate
methods, and achieves ROI coding and multi-distortion trade-off with a single
decoder.
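The abstract names the core mechanism but not how it looks in practice. Below is a minimal, hypothetical sketch of semi-amortized "code editing": the latent code produced by a fixed encoder is refined by gradient descent on an R-D objective whose trade-off weight and quantization step can be changed per image (or per region) without retraining anything. The ToyDecoder, rate_proxy, and edit_code names, shapes, and hyperparameters are illustrative assumptions, not the paper's actual networks or training setup.

```python
# Hypothetical sketch of semi-amortized inference ("code editing") with an
# adaptive quantization step. Only the latent y is optimized; the decoder
# stays fixed, so one decoder can serve many bitrates.
import torch
import torch.nn as nn

class ToyDecoder(nn.Module):
    """Placeholder decoder, not the paper's architecture."""
    def __init__(self, latent_ch=192):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_ch, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1),
        )
    def forward(self, y):
        return self.net(y)

def rate_proxy(y_hat):
    # Placeholder differentiable rate estimate; a real codec would use a
    # learned entropy model (e.g. a hyperprior).
    return torch.log1p(y_hat.abs()).sum()

def edit_code(y_init, x, decoder, lam, q_step=1.0, steps=100, lr=1e-2):
    """Refine the latent y for one image at a given R-D trade-off lam."""
    for p in decoder.parameters():
        p.requires_grad_(False)          # decoder is fixed
    y = y_init.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([y], lr=lr)
    for _ in range(steps):
        # Straight-through quantization with an adaptive step size.
        y_hat = y + (torch.round(y / q_step) * q_step - y).detach()
        x_hat = decoder(y_hat)
        loss = rate_proxy(y_hat) + lam * torch.mean((x - x_hat) ** 2)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.round(y.detach() / q_step) * q_step

decoder = ToyDecoder()
x = torch.rand(1, 3, 64, 64)        # input image
y0 = torch.randn(1, 192, 16, 16)    # latent from some fixed encoder
y_low = edit_code(y0, x, decoder, lam=0.01)   # lower bitrate
y_high = edit_code(y0, x, decoder, lam=1.0)   # higher bitrate
```

In this sketch a smaller lam favors lower bitrates; varying lam (or q_step) spatially is one plausible way to obtain ROI-style control while reusing the same decoder.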
Related papers
- Rate-Distortion-Cognition Controllable Versatile Neural Image Compression [47.72668401825835]
We propose a rate-distortion-cognition controllable versatile image compression method.
Our method yields satisfactory ICM performance and flexible Rate-Distortion-Cognition control.
arXiv Detail & Related papers (2024-07-16T13:17:51Z)
- Once-for-All: Controllable Generative Image Compression with Dynamic Granularity Adaption [57.056311855630916]
We propose a Controllable Generative Image Compression framework, Control-GIC.
It is capable of fine-grained adaption across a broad spectrum while ensuring high fidelity and generality.
We develop a conditional decoder that can trace back to historic encoded multi-granularity representations.
arXiv Detail & Related papers (2024-06-02T14:22:09Z)
- Enhancing the Rate-Distortion-Perception Flexibility of Learned Image Codecs with Conditional Diffusion Decoders [7.485128109817576]
We show that conditional diffusion models can lead to promising results in the generative compression task when used as a decoder.
arXiv Detail & Related papers (2024-03-05T11:48:35Z)
- Neural Data-Dependent Transform for Learned Image Compression [72.86505042102155]
We build a neural data-dependent transform and introduce a continuous online mode decision mechanism to jointly optimize the coding efficiency for each individual image.
The experimental results show the effectiveness of the proposed neural-syntax design and the continuous online mode decision mechanism.
arXiv Detail & Related papers (2022-03-09T14:56:48Z)
- Neural JPEG: End-to-End Image Compression Leveraging a Standard JPEG Encoder-Decoder [73.48927855855219]
We propose a system that learns to improve the encoding performance by enhancing its internal neural representations on both the encoder and decoder ends.
Experiments demonstrate that our approach successfully improves the rate-distortion performance over JPEG across various quality metrics.
arXiv Detail & Related papers (2022-01-27T20:20:03Z)
- Rate Distortion Characteristic Modeling for Neural Image Compression [59.25700168404325]
End-to-end optimization gives neural image compression (NIC) superior lossy compression performance.
However, distinct models must be trained to reach different points in the rate-distortion (R-D) space.
We formulate the essential mathematical functions that describe the R-D behavior of NIC using deep networks and statistical modeling.
arXiv Detail & Related papers (2021-06-24T12:23:05Z)
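As a toy illustration of describing R-D behavior with an explicit function, the snippet below fits a simple parametric curve to a few (bitrate, distortion) samples and then predicts an intermediate operating point. The exponential form and all numbers are made-up assumptions for illustration, not the paper's actual formulation or data.

```python
# Hypothetical R-D characteristic fit: given a handful of measured
# (rate, distortion) points, fit a smooth curve so intermediate points
# can be predicted without training a separate model per bitrate.
import numpy as np
from scipy.optimize import curve_fit

def rd_curve(rate, c0, c1, c2):
    # Assumed monotonically decreasing distortion as bitrate grows.
    return c0 + c1 * np.exp(-c2 * rate)

# Toy measurements: bitrate in bpp, distortion as MSE (made-up numbers).
rates = np.array([0.1, 0.25, 0.5, 1.0, 2.0])
dists = np.array([120.0, 70.0, 35.0, 15.0, 6.0])

params, _ = curve_fit(rd_curve, rates, dists, p0=(1.0, 100.0, 2.0))
print("fitted params:", params)
print("predicted distortion at 0.75 bpp:", rd_curve(0.75, *params))
```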
- Substitutional Neural Image Compression [48.20906717052056]
Substitutional Neural Image Compression (SNIC) is a general approach for enhancing any neural image compression model.
It boosts compression performance toward a flexible distortion metric and enables bit-rate control using a single model instance.
arXiv Detail & Related papers (2021-05-16T20:53:31Z)
- Learned Multi-Resolution Variable-Rate Image Compression with Octave-based Residual Blocks [15.308823742699039]
We propose a new variable-rate image compression framework, which employs generalized octave convolutions (GoConv) and generalized octave transposed-convolutions (GoTConv).
To enable a single model to operate at different bit rates and to learn multi-rate image features, a new objective function is introduced.
Experimental results show that the proposed framework, trained with the variable-rate objective function, outperforms standard codecs such as H.265/HEVC-based BPG and state-of-the-art learning-based variable-rate methods.
arXiv Detail & Related papers (2020-12-31T06:26:56Z)
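For context on what a single-model, multi-rate objective can look like, here is a generic sketch that averages the R-D Lagrangian over several lambda values. This is a common pattern for variable-rate training and is not claimed to be the exact objective of the paper above.

```python
# Generic multi-rate training objective: one set of weights is optimized
# over several lambda values so a single model covers multiple points on
# the R-D curve. Values below are made-up scalars standing in for the
# estimated bitrate and reconstruction distortion at each operating point.
import torch

def multi_rate_loss(rates, distortions, lambdas):
    """rates/distortions: per-lambda scalar tensors produced by one model."""
    loss = torch.zeros(())
    for r, d, lam in zip(rates, distortions, lambdas):
        loss = loss + r + lam * d
    return loss / len(lambdas)

lambdas = [0.01, 0.1, 1.0]
rates = [torch.tensor(0.8), torch.tensor(1.5), torch.tensor(2.6)]
dists = [torch.tensor(90.0), torch.tensor(30.0), torch.tensor(8.0)]
print(multi_rate_loss(rates, dists, lambdas))
```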
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences arising from its use.