Quantum Implicit Neural Compression
- URL: http://arxiv.org/abs/2412.19828v1
- Date: Thu, 19 Dec 2024 13:41:29 GMT
- Title: Quantum Implicit Neural Compression
- Authors: Takuya Fujihashi, Toshiaki Koike-Akino
- Abstract summary: We introduce quantum INR, which leverages the exponentially rich expressivity of quantum neural networks for data compression.
Evaluations using some benchmark datasets show that the proposed quINR-based compression could improve rate-distortion performance in image compression.
- Score: 11.028123436097616
- Abstract: Signal compression based on implicit neural representation (INR) is an emerging technique for representing multimedia signals with a small number of bits. While INR-based signal compression achieves high-quality reconstruction of relatively low-resolution signals, the accuracy of high-frequency details degrades significantly when the model is small. To improve the compression efficiency of INR, we introduce quantum INR (quINR), which leverages the exponentially rich expressivity of quantum neural networks for data compression. Evaluations on benchmark datasets show that the proposed quINR-based compression can improve rate-distortion performance in image compression compared with traditional codecs and classical INR-based coding methods, with gains of up to 1.2 dB.
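To make the idea concrete, below is a minimal sketch of a quantum implicit neural representation fitted to pixel data, written with PennyLane. It is not the authors' implementation: the 4-qubit width, the data re-uploading circuit layout (AngleEmbedding followed by StronglyEntanglingLayers), and the toy training loop are illustrative assumptions. The only point it demonstrates is the INR compression recipe with a parameterized quantum circuit in place of the usual coordinate MLP: overfit the model to one image, then store its (quantized, entropy-coded) parameters as the bitstream.

```python
# Illustrative quINR sketch (assumed design, not the paper's exact circuit).
import numpy as np
import pennylane as qml
from pennylane import numpy as pnp

n_qubits, n_layers = 4, 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def quinr(xy, weights):
    """Map a normalized pixel coordinate (x, y) to a scalar in [-1, 1]."""
    for layer in weights:  # data re-uploading: re-encode coords before each layer
        qml.AngleEmbedding(xy, wires=range(n_qubits), rotation="Y")
        qml.StronglyEntanglingLayers(layer[np.newaxis], wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

def fit(coords, pixels, steps=200, lr=0.1):
    """Overfit the circuit to one image; the trained parameters (after
    quantization and entropy coding, not shown) act as the compressed code."""
    weights = pnp.array(
        np.random.uniform(0, np.pi, (n_layers, n_qubits, 3)), requires_grad=True
    )
    opt = qml.GradientDescentOptimizer(stepsize=lr)

    def mse(w):
        preds = pnp.stack([quinr(xy, w) for xy in coords])
        return pnp.mean((preds - pixels) ** 2)

    for _ in range(steps):
        weights = opt.step(mse, weights)
    return weights

# Toy usage: four grayscale pixels rescaled to [-1, 1].
coords = pnp.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
pixels = pnp.array([-1.0, -0.2, 0.4, 1.0])
trained_weights = fit(coords, pixels, steps=50)
```

A classical INR baseline would use the same loop with a small coordinate MLP; the abstract's claim is that the quantum circuit's expressivity lets a comparably small parameter budget capture more high-frequency detail.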
Related papers
- Streaming Neural Images [56.41827271721955]
Implicit Neural Representations (INRs) are a novel paradigm for signal representation that have attracted considerable interest for image compression.
In this work, we explore the critical yet overlooked limiting factors of INRs, such as computational cost, unstable performance, and robustness.
arXiv Detail & Related papers (2024-09-25T17:51:20Z)
- UniCompress: Enhancing Multi-Data Medical Image Compression with Knowledge Distillation [59.3877309501938]
Implicit Neural Representation (INR) networks have shown remarkable versatility due to their flexible compression ratios.
We introduce a codebook containing frequency domain information as a prior input to the INR network.
This enhances the representational power of INR and provides distinctive conditioning for different image blocks.
arXiv Detail & Related papers (2024-05-27T05:52:13Z)
- Semantic Ensemble Loss and Latent Refinement for High-Fidelity Neural Image Compression [58.618625678054826]
This study presents an enhanced neural compression method designed for optimal visual fidelity.
We have trained our model with a sophisticated semantic ensemble loss, integrating Charbonnier loss, perceptual loss, style loss, and a non-binary adversarial loss.
Our empirical findings demonstrate that this approach significantly improves the statistical fidelity of neural image compression.
arXiv Detail & Related papers (2024-01-25T08:11:27Z)
- Modality-Agnostic Variational Compression of Implicit Neural Representations [96.35492043867104]
We introduce a modality-agnostic neural compression algorithm based on a functional view of data and parameterised as an Implicit Neural Representation (INR)
Bridging the gap between latent coding and sparsity, we obtain compact latent representations non-linearly mapped to a soft gating mechanism.
After obtaining a dataset of such latent representations, we directly optimise the rate/distortion trade-off in a modality-agnostic space using neural compression.
arXiv Detail & Related papers (2023-01-23T15:22:42Z)
- SINCO: A Novel structural regularizer for image compression using implicit neural representations [10.251120382395332]
Implicit neural representations (INR) have been recently proposed as deep learning (DL) based solutions for image compression.
We present structural regularization for INR compression (SINCO) as a novel INR method for image compression.
arXiv Detail & Related papers (2022-10-26T18:35:54Z)
- Neural JPEG: End-to-End Image Compression Leveraging a Standard JPEG Encoder-Decoder [73.48927855855219]
We propose a system that learns to improve the encoding performance by enhancing its internal neural representations on both the encoder and decoder ends.
Experiments demonstrate that our approach successfully improves the rate-distortion performance over JPEG across various quality metrics.
arXiv Detail & Related papers (2022-01-27T20:20:03Z)
- Implicit Neural Representations for Image Compression [103.78615661013623]
Implicit Neural Representations (INRs) have gained attention as a novel and effective representation for various data types.
We propose the first comprehensive compression pipeline based on INRs including quantization, quantization-aware retraining and entropy coding.
We find that our approach to source compression with INRs vastly outperforms similar prior work.
arXiv Detail & Related papers (2021-12-08T13:02:53Z)
- A GAN-based Tunable Image Compression System [13.76136694287327]
This paper rethinks content-based compression by using Generative Adversarial Network (GAN) to reconstruct the non-important regions.
A tunable compression scheme is also proposed in this paper to compress an image to any specific compression ratio without retraining the model.
arXiv Detail & Related papers (2020-01-18T02:40:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.