Gaussian Rate-Distortion-Perception Coding and Entropy-Constrained Scalar Quantization
- URL: http://arxiv.org/abs/2409.02388v1
- Date: Wed, 4 Sep 2024 02:31:53 GMT
- Title: Gaussian Rate-Distortion-Perception Coding and Entropy-Constrained Scalar Quantization
- Authors: Li Xie, Liangyan Li, Jun Chen, Lei Yu, Zhongshan Zhang
- Abstract summary: This paper investigates the best known bounds on the quadratic Gaussian distortion-rate-perception function with limited common randomness.
The bounds are nondegenerate in the sense that they cannot be deduced from each other via a refined version of Talagrand's transportation inequality.
An improved lower bound is established when the perception measure is given by the squared Wasserstein-2 distance.
- Score: 12.575809787716771
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper investigates the best known bounds on the quadratic Gaussian distortion-rate-perception function with limited common randomness for the Kullback-Leibler divergence-based perception measure, as well as their counterparts for the squared Wasserstein-2 distance-based perception measure, recently established by Xie et al. These bounds are shown to be nondegenerate in the sense that they cannot be deduced from each other via a refined version of Talagrand's transportation inequality. On the other hand, an improved lower bound is established when the perception measure is given by the squared Wasserstein-2 distance. In addition, it is revealed by exploiting the connection between rate-distortion-perception coding and entropy-constrained scalar quantization that all the aforementioned bounds are generally not tight in the weak perception constraint regime.
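Both perception measures discussed in the abstract have closed forms for scalar Gaussians: the squared Wasserstein-2 distance is (μ₁−μ₂)² + (σ₁−σ₂)², and the KL divergence is log(σ₂/σ₁) + (σ₁² + (μ₁−μ₂)²)/(2σ₂²) − 1/2. A minimal sketch of the two measures (not code from the paper; function names are illustrative):

```python
import math

def w2_squared_gaussian(mu1, sigma1, mu2, sigma2):
    """Squared Wasserstein-2 distance between N(mu1, sigma1^2) and N(mu2, sigma2^2).

    Closed form for scalar Gaussians: (mu1 - mu2)^2 + (sigma1 - sigma2)^2.
    """
    return (mu1 - mu2) ** 2 + (sigma1 - sigma2) ** 2

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """KL divergence D(N(mu1, sigma1^2) || N(mu2, sigma2^2))."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# Both measures vanish when the two distributions coincide.
print(w2_squared_gaussian(0.0, 1.0, 0.0, 1.0))  # 0.0
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))          # 0.0
```

That the two measures can rank pairs of distributions differently is one way to see why, as the abstract notes, bounds for one perception constraint cannot simply be deduced from bounds for the other.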
Related papers
- Universal bounds in quantum metrology in presence of correlated noise [0.0]
We derive fundamental bounds for general quantum metrological models involving temporal or spatial correlations.
Although the bounds are not guaranteed to be tight in general, their tightness may be systematically increased by increasing numerical complexity.
arXiv Detail & Related papers (2024-10-02T18:00:00Z) - Output-Constrained Lossy Source Coding With Application to Rate-Distortion-Perception Theory [9.464977414419332]
The distortion-rate function of output-constrained lossy source coding with limited common randomness is analyzed.
An explicit expression is obtained when both source and reconstruction distributions are Gaussian.
arXiv Detail & Related papers (2024-03-21T21:51:36Z) - A U-turn on Double Descent: Rethinking Parameter Counting in Statistical Learning [68.76846801719095]
We show where double descent actually appears, and that its location is not inherently tied to the interpolation threshold p=n.
This provides a resolution to tensions between double descent and statistical intuition.
arXiv Detail & Related papers (2023-10-29T12:05:39Z) - Quadratic pseudospectrum for identifying localized states [68.8204255655161]
The quadratic pseudospectrum is a method for approaching systems with incompatible observables.
We derive an important estimate relating the Clifford and quadratic pseudospectra.
We prove that the quadratic pseudospectrum is local, and derive bounds on the errors that are incurred by truncating the system in the vicinity of where the pseudospectrum is being calculated.
arXiv Detail & Related papers (2022-04-22T00:57:09Z) - Sharp Bounds for Federated Averaging (Local SGD) and Continuous Perspective [49.17352150219212]
Federated Averaging (FedAvg), also known as Local SGD, is one of the most popular algorithms in Federated Learning (FL).
We show how to analyze this algorithm from the Stochastic Differential Equation (SDE) perspective.
arXiv Detail & Related papers (2021-11-05T22:16:11Z) - Tight Exponential Analysis for Smoothing the Max-Relative Entropy and for Quantum Privacy Amplification [56.61325554836984]
The max-relative entropy together with its smoothed version is a basic tool in quantum information theory.
We derive the exact exponent for the decay of the small modification of the quantum state in smoothing the max-relative entropy based on purified distance.
arXiv Detail & Related papers (2021-11-01T16:35:41Z) - Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise linear discretization preserves contrast better than existing discretization approaches.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z) - Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss [111.8807588392563]
Boundary discontinuity and its inconsistency with the final detection metric have been the bottleneck in designing regression losses for rotated object detection.
We propose a novel regression loss based on the Gaussian Wasserstein distance as a fundamental approach to solving this problem.
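The idea behind this line of work is to model a rotated box as a 2-D Gaussian (mean at the box center, covariance from its size and angle) and compare boxes via the closed-form Gaussian Wasserstein-2 distance: W₂² = ‖m₁−m₂‖² + Tr(Σ₁ + Σ₂ − 2(Σ₂^{1/2} Σ₁ Σ₂^{1/2})^{1/2}). A hedged sketch of that computation (not the paper's implementation; function names and the w²/4 variance convention are illustrative):

```python
import numpy as np

def sym_sqrt(m):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.T

def gaussian_w2_squared(m1, s1, m2, s2):
    """Squared Wasserstein-2 distance between N(m1, s1) and N(m2, s2)."""
    r = sym_sqrt(s2)
    cross = sym_sqrt(r @ s1 @ r)
    return float(np.sum((m1 - m2) ** 2) + np.trace(s1 + s2 - 2.0 * cross))

def box_to_gaussian(cx, cy, w, h, theta):
    """Map a rotated box (center, width, height, angle) to a 2-D Gaussian:
    mean = center, covariance = R diag(w^2/4, h^2/4) R^T."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    cov = rot @ np.diag([w ** 2 / 4.0, h ** 2 / 4.0]) @ rot.T
    return np.array([cx, cy]), cov

# A box compared with a unit-translated copy of itself: W2^2 equals the
# squared translation, and the distance varies smoothly with the angle,
# which is what sidesteps the boundary-discontinuity issue.
m1, s1 = box_to_gaussian(0.0, 0.0, 2.0, 1.0, 0.3)
m2, s2 = box_to_gaussian(1.0, 0.0, 2.0, 1.0, 0.3)
print(gaussian_w2_squared(m1, s1, m2, s2))  # ~1.0
```

Because the Gaussian representation has no angular wrap-around, the loss stays continuous where angle-based regression targets jump.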
arXiv Detail & Related papers (2021-01-28T12:04:35Z) - Tighter expected generalization error bounds via Wasserstein distance [23.52237892358981]
We introduce several expected generalization error bounds based on the Wasserstein distance.
We present full-dataset, single-letter, and random-subset bounds on both the standard setting and the randomized-subsample setting.
We show how various new bounds based on different information measures can be derived from the presented bounds.
arXiv Detail & Related papers (2021-01-22T20:13:59Z) - Tightening the tripartite quantum memory assisted entropic uncertainty relation [0.0]
In quantum information theory, Shannon entropy has been used as an appropriate measure to express the uncertainty relation.
One can extend the bipartite quantum memory assisted entropic uncertainty relation to a tripartite quantum memory assisted entropic uncertainty relation.
arXiv Detail & Related papers (2020-05-05T12:51:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.