Rate-Distortion-Perception Tradeoff Based on the
Conditional-Distribution Perception Measure
- URL: http://arxiv.org/abs/2401.12207v1
- Date: Mon, 22 Jan 2024 18:49:56 GMT
- Title: Rate-Distortion-Perception Tradeoff Based on the
Conditional-Distribution Perception Measure
- Authors: Sadaf Salehkalaibar, Jun Chen, Ashish Khisti and Wei Yu
- Abstract summary: We study the rate-distortion-perception (RDP) tradeoff for a memoryless source model in the limit of large blocklengths.
Our perception measure is based on a divergence between the distributions of the source and reconstruction sequences conditioned on the encoder output.
- Score: 33.084834042565895
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the rate-distortion-perception (RDP) tradeoff for a memoryless
source model in the asymptotic limit of large block-lengths. Our perception
measure is based on a divergence between the distributions of the source and
reconstruction sequences conditioned on the encoder output, which was first
proposed in [1], [2]. We consider the case when there is no shared randomness
between the encoder and the decoder. For the case of discrete memoryless
sources we derive a single-letter characterization of the RDP function, thus
settling a problem that remains open for the marginal metric introduced in Blau
and Michaeli [3] (with no shared randomness). Our achievability scheme is based
on lossy source coding with a posterior reference map proposed in [4]. For the
case of continuous-valued sources under the squared-error distortion measure and
the squared quadratic Wasserstein perception measure we also derive a single-letter
characterization and show that a noise-adding mechanism at the decoder suffices
to achieve the optimal representation. For the case of zero perception loss, we
show that our characterization interestingly coincides with the results for the
marginal metric derived in [5], [6] and again demonstrate that zero perception
loss can be achieved with a $3$-dB penalty in the minimum distortion. Finally
we specialize our results to the case of Gaussian sources. We derive the RDP
function for vector Gaussian sources and propose a waterfilling type solution.
We also partially characterize the RDP function for a mixture of vector
Gaussians.
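As a quick numerical illustration (not the paper's actual scheme), the classical scalar Gaussian distortion-rate function under MSE is D(R) = σ²·2^(−2R), and the abstract's 3-dB penalty at zero perception loss corresponds to doubling that distortion. The sketch below also shows classical reverse waterfilling for parallel Gaussian sources, the standard rate-distortion solution that a waterfilling-type RDP characterization generalizes; the function names and bisection tolerance are our own choices.

```python
import numpy as np

def gaussian_distortion_rate(sigma2, R):
    # Classical Gaussian distortion-rate function under MSE: D(R) = sigma^2 * 2^(-2R).
    return sigma2 * 2.0 ** (-2.0 * R)

def reverse_waterfilling(variances, D_total, tol=1e-9):
    # Classical reverse waterfilling for parallel Gaussian sources:
    # pick a water level theta so that sum_i min(theta, sigma_i^2) = D_total,
    # then allocate D_i = min(theta, sigma_i^2) and
    # R = sum_i 0.5 * log2(sigma_i^2 / D_i) over active components.
    variances = np.asarray(variances, dtype=float)
    lo, hi = 0.0, variances.max()
    while hi - lo > tol:
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, variances).sum() < D_total:
            lo = theta
        else:
            hi = theta
    theta = 0.5 * (lo + hi)
    D = np.minimum(theta, variances)
    R = 0.5 * np.log2(np.maximum(variances / D, 1.0)).sum()
    return R, D

sigma2, R = 1.0, 2.0
d = gaussian_distortion_rate(sigma2, R)  # distortion with no perception constraint
d_perfect = 2.0 * d                      # 3-dB penalty: distortion doubles at zero perception loss
print(d, d_perfect)                      # 0.0625 0.125

rate, alloc = reverse_waterfilling([4.0, 1.0, 0.25], D_total=0.75)
print(rate, alloc)                       # rate 3.0, allocation [0.25, 0.25, 0.25]
```

The bisection converges because the allocated total distortion is nondecreasing in the water level; the factor-of-2 penalty is the abstract's 3-dB statement read as a distortion ratio.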
Related papers
- The Rate-Distortion-Perception Trade-off: The Role of Private Randomness [53.81648040452621]
We characterize the corresponding rate-distortion trade-off and show that private randomness is not useful if the compression rate is lower than the entropy of the source.
arXiv Detail & Related papers (2024-04-01T13:36:01Z) - Output-Constrained Lossy Source Coding With Application to Rate-Distortion-Perception Theory [9.464977414419332]
The distortion-rate function of output-constrained lossy source coding with limited common randomness is analyzed.
An explicit expression is obtained when both source and reconstruction distributions are Gaussian.
arXiv Detail & Related papers (2024-03-21T21:51:36Z) - Characterization of the Distortion-Perception Tradeoff for Finite
Channels with Arbitrary Metrics [31.383958289479015]
We study the distortion-perception tradeoff over finite-alphabet channels.
We show that computing the DP function and the optimal reconstructions is equivalent to solving a set of linear programming problems.
arXiv Detail & Related papers (2024-02-03T21:17:15Z) - On the Computation of the Gaussian Rate-Distortion-Perception Function [10.564071872770146]
We study the computation of the rate-distortion-perception function (RDPF) for a multivariate Gaussian source under mean squared error (MSE) distortion.
We provide the associated algorithmic realization, as well as the convergence and the rate of convergence characterization.
We corroborate our results with numerical simulations and draw connections to existing results.
arXiv Detail & Related papers (2023-11-15T18:34:03Z) - The Rate-Distortion-Perception Tradeoff: The Role of Common Randomness [23.37690979017006]
This paper focuses on the case of perfect realism, which coincides with the problem of distribution-preserving lossy compression.
The existing tradeoff is recovered by allowing for the amount of common randomness to be infinite.
arXiv Detail & Related papers (2022-02-08T21:14:57Z) - Robust Estimation for Nonparametric Families via Generative Adversarial
Networks [92.64483100338724]
We provide a framework for designing Generative Adversarial Networks (GANs) to solve high dimensional robust statistics problems.
Our work extends these to robust mean estimation, second moment estimation, and robust linear regression.
In terms of techniques, our proposed GAN losses can be viewed as a smoothed and generalized Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2022-02-02T20:11:33Z) - On the Double Descent of Random Features Models Trained with SGD [78.0918823643911]
We study properties of random features (RF) regression in high dimensions optimized by stochastic gradient descent (SGD).
We derive precise non-asymptotic error bounds of RF regression under both constant and adaptive step-size SGD setting.
We observe the double descent phenomenon both theoretically and empirically.
arXiv Detail & Related papers (2021-10-13T17:47:39Z) - A Theory of the Distortion-Perception Tradeoff in Wasserstein Space [35.25746003630763]
The lower the distortion of an estimator, the more the distribution of its outputs deviates from the distribution of the signals it attempts to estimate.
This phenomenon has captured significant attention in image restoration, where it implies that fidelity to ground truth images comes at the expense of perceptual quality.
We show how estimators can be constructed from the estimators at the two extremes of the perception-distortion tradeoff.
arXiv Detail & Related papers (2021-07-06T11:53:36Z) - Direct Measure Matching for Crowd Counting [59.66286603624411]
We propose a new measure-based counting approach to regress the predicted density maps to the scattered point-annotated ground truth directly.
In this paper, we derive a semi-balanced form of Sinkhorn divergence, based on which a Sinkhorn counting loss is designed for measure matching.
arXiv Detail & Related papers (2021-07-04T06:37:33Z) - Uncertainty Inspired RGB-D Saliency Detection [70.50583438784571]
We propose the first framework to employ uncertainty for RGB-D saliency detection by learning from the data labeling process.
Inspired by the saliency data labeling process, we propose a generative architecture to achieve probabilistic RGB-D saliency detection.
Results on six challenging RGB-D benchmark datasets show our approach's superior performance in learning the distribution of saliency maps.
arXiv Detail & Related papers (2020-09-07T13:01:45Z) - Approximation Schemes for ReLU Regression [80.33702497406632]
We consider the fundamental problem of ReLU regression.
The goal is to output the best-fitting ReLU with respect to square loss, given draws from some unknown distribution.
arXiv Detail & Related papers (2020-05-26T16:26:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.