A Theory of the Distortion-Perception Tradeoff in Wasserstein Space
- URL: http://arxiv.org/abs/2107.02555v1
- Date: Tue, 6 Jul 2021 11:53:36 GMT
- Title: A Theory of the Distortion-Perception Tradeoff in Wasserstein Space
- Authors: Dror Freirich, Tomer Michaeli, Ron Meir
- Abstract summary: The lower the distortion of an estimator, the more the distribution of its outputs deviates from the distribution of the signals it attempts to estimate.
This phenomenon has captured significant attention in image restoration, where it implies that fidelity to ground truth images comes at the expense of perceptual quality.
We show how estimators can be constructed from the estimators at the two extremes of the perception-distortion tradeoff.
- Score: 35.25746003630763
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The lower the distortion of an estimator, the more the distribution of its
outputs generally deviates from the distribution of the signals it attempts to
estimate. This phenomenon, known as the perception-distortion tradeoff, has
captured significant attention in image restoration, where it implies that
fidelity to ground truth images comes at the expense of perceptual quality
(deviation from statistics of natural images). However, despite the increasing
popularity of performing comparisons on the perception-distortion plane, there
remains an important open question: what is the minimal distortion that can be
achieved under a given perception constraint? In this paper, we derive a closed
form expression for this distortion-perception (DP) function for the mean
squared-error (MSE) distortion and the Wasserstein-2 perception index. We prove
that the DP function is always quadratic, regardless of the underlying
distribution. This stems from the fact that estimators on the DP curve form a
geodesic in Wasserstein space. In the Gaussian setting, we further provide a
closed form expression for such estimators. For general distributions, we show
how these estimators can be constructed from the estimators at the two extremes
of the tradeoff: The global MSE minimizer, and a minimizer of the MSE under a
perfect perceptual quality constraint. The latter can be obtained as a
stochastic transformation of the former.
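The interpolation construction described in the abstract can be sketched in a toy 1-D Gaussian setting. The closed-form expressions below follow standard Gaussian/MMSE algebra; the variable names (`rho`, `p0`, `dp_point`) are ours, not the paper's, and this is only a minimal illustration of the quadratic DP function, not the paper's general result:

```python
import numpy as np

# 1-D Gaussian illustration of the distortion-perception (DP) tradeoff.
# Signal X ~ N(0, 1) observed as Y = X + N with N ~ N(0, sigma2).
sigma2 = 1.0
rho = 1.0 / (1.0 + sigma2)          # variance of the MMSE estimate E[X|Y]

# Extremes of the tradeoff, both linear in Y here:
# X0 = E[X|Y] (global MSE minimizer); X1 = X0 / sqrt(rho), rescaled so its
# distribution matches that of X (perfect perceptual quality).
def dp_point(lam):
    """MSE and W2 perception index of the interpolated estimator
    X_lam = (1 - lam) * X0 + lam * X1, a geodesic in Wasserstein space."""
    c = (1 - lam) + lam / np.sqrt(rho)      # X_lam = c * X0
    mse = rho * (c - 1) ** 2 + (1 - rho)    # E[(c*X0 - X)^2] in closed form
    w2 = abs(1 - c * np.sqrt(rho))          # W2 between N(0,(c*sqrt(rho))^2) and N(0,1)
    return mse, w2

d_star = 1 - rho                    # minimal MSE (lam = 0)
p0 = 1 - np.sqrt(rho)               # perception index of the MMSE estimator

# The DP function comes out quadratic: D(P) = D* + (P0 - P)^2.
for lam in np.linspace(0.0, 1.0, 5):
    d, p = dp_point(lam)
    assert np.isclose(d, d_star + (p0 - p) ** 2)
```

Moving along `lam` trades distortion for perceptual quality: `lam = 0` attains the minimal MSE with perception index `p0`, while `lam = 1` reaches perfect perceptual quality at an extra MSE cost of exactly `p0 ** 2`.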
Related papers
- Rejection via Learning Density Ratios [50.91522897152437]
Classification with rejection emerges as a learning paradigm which allows models to abstain from making predictions.
We propose a different distributional perspective, where we seek to find an idealized data distribution which maximizes a pretrained model's performance.
Our framework is tested empirically over clean and noisy datasets.
arXiv Detail & Related papers (2024-05-29T01:32:17Z)
- Characterization of the Distortion-Perception Tradeoff for Finite Channels with Arbitrary Metrics [31.383958289479015]
We study the distortion-perception tradeoff over finite-alphabet channels.
We show that computing the DP function and the optimal reconstructions is equivalent to solving a set of linear programming problems.
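A minimal numerical sketch of that kind of LP reduction is given below. The alphabet sizes, joint distribution, Hamming distortion, and the use of total variation as the perception measure are all illustrative choices made here, not taken from the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Toy finite-alphabet DP computation: one linear program per perception level P.
p_xy = np.array([[0.45, 0.15],      # joint p(x, y); rows index x, columns y
                 [0.05, 0.35]])
nx, ny = p_xy.shape
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
d = 1.0 - np.eye(nx)                # Hamming distortion d(x, xhat)

def dp_value(P):
    """Minimize E[d(X, Xhat)] over channels q(xhat | y), subject to
    TV(p_Xhat, p_X) <= P. Variables: q flattened (xhat-major), then one
    slack t[xhat] per symbol to linearize the absolute values in the TV term."""
    nq = nx * ny
    # Objective: coefficient of q[xhat, y] is sum_x p(x, y) d(x, xhat).
    c = np.concatenate([(d.T @ p_xy).ravel(), np.zeros(nx)])
    # Each conditional distribution q(. | y) must sum to one.
    A_eq = np.zeros((ny, nq + nx))
    for y in range(ny):
        A_eq[y, y:nq:ny] = 1.0
    b_eq = np.ones(ny)
    # |p_Xhat[xh] - p_x[xh]| <= t[xh]  and  sum(t) <= 2 * P.
    A_ub = np.zeros((2 * nx + 1, nq + nx))
    b_ub = np.zeros(2 * nx + 1)
    for xh in range(nx):
        A_ub[2 * xh, xh * ny:(xh + 1) * ny] = p_y       # p_Xhat[xh] - t[xh] <= p_x[xh]
        A_ub[2 * xh, nq + xh] = -1.0
        b_ub[2 * xh] = p_x[xh]
        A_ub[2 * xh + 1, xh * ny:(xh + 1) * ny] = -p_y  # -p_Xhat[xh] - t[xh] <= -p_x[xh]
        A_ub[2 * xh + 1, nq + xh] = -1.0
        b_ub[2 * xh + 1] = -p_x[xh]
    A_ub[-1, nq:] = 1.0
    b_ub[-1] = 2.0 * P
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0.0, None)] * (nq + nx))
    return res.fun
```

Sweeping `P` from 0 upward traces a DP curve: with this joint distribution the perception constraint binds below `P = 0.1`, above which the unconstrained minimum distortion of 0.2 is attained.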
arXiv Detail & Related papers (2024-02-03T21:17:15Z)
- Rate-Distortion-Perception Tradeoff Based on the Conditional-Distribution Perception Measure [33.084834042565895]
We study the rate-distortion-perception (RDP) tradeoff for a memoryless source model in the limit of large blocklengths.
Our perception measure is based on a divergence between the distributions of the source and reconstruction sequences conditioned on the encoder output.
arXiv Detail & Related papers (2024-01-22T18:49:56Z)
- Statistically Optimal Generative Modeling with Maximum Deviation from the Empirical Distribution [2.1146241717926664]
We show that the Wasserstein GAN, constrained to left-invertible push-forward maps, generates distributions that avoid replication and significantly deviate from the empirical distribution.
Our most important contribution provides a finite-sample lower bound on the Wasserstein-1 distance between the generative distribution and the empirical one.
We also establish a finite-sample upper bound on the distance between the generative distribution and the true data-generating one.
arXiv Detail & Related papers (2023-07-31T06:11:57Z)
- Hinge-Wasserstein: Estimating Multimodal Aleatoric Uncertainty in Regression Tasks [9.600416563894658]
We study regression from images to parameter values, where it is common to quantify uncertainty by predicting probability distributions.
In the absence of full ground-truth distributions, traditional loss functions yield poor probability-distribution estimates and severe overconfidence.
We propose hinge-Wasserstein -- a simple improvement of the Wasserstein loss that reduces the penalty for weak secondary modes during training.
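One plausible reading of such a hinged loss, for histograms on a shared 1-D grid, is to drop per-bin CDF gaps below a threshold. The threshold `eps` and this exact formulation are illustrative guesses, not necessarily the paper's definition:

```python
import numpy as np

def w1(p, q):
    """Plain Wasserstein-1 between two histograms on the same 1-D grid:
    the L1 distance between their CDFs."""
    return np.abs(np.cumsum(p) - np.cumsum(q)).sum()

def hinge_w1(p, q, eps=0.05):
    """Hinged variant: per-bin CDF gaps smaller than eps go unpenalized,
    so a weak secondary mode in the prediction p is tolerated."""
    gap = np.abs(np.cumsum(p) - np.cumsum(q))
    return np.maximum(gap - eps, 0.0).sum()
```

For example, a prediction that places a small amount of mass away from a one-hot target incurs a positive `w1` but a near-zero `hinge_w1`, which is the stated goal of reducing the penalty on weak secondary modes.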
arXiv Detail & Related papers (2023-06-01T11:20:09Z)
- On the Variance, Admissibility, and Stability of Empirical Risk Minimization [80.26309576810844]
Empirical Risk Minimization (ERM) with squared loss may attain minimax suboptimal error rates.
We show that under mild assumptions, the suboptimality of ERM must be due to large bias rather than variance.
We also show that our estimates imply stability of ERM, complementing the main result of Caponnetto and Rakhlin (2006) for non-Donsker classes.
arXiv Detail & Related papers (2023-05-29T15:25:48Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- Wasserstein Distributionally Robust Estimation in High Dimensions: Performance Analysis and Optimal Hyperparameter Tuning [0.0]
We propose a Wasserstein distributionally robust estimation framework to estimate an unknown parameter from noisy linear measurements.
We focus on the task of analyzing the squared error performance of such estimators.
We show that the squared error can be recovered as the solution of a convex-concave optimization problem.
arXiv Detail & Related papers (2022-06-27T13:02:59Z)
- An Indirect Rate-Distortion Characterization for Semantic Sources: General Model and the Case of Gaussian Observation [83.93224401261068]
The source model is motivated by the recent surge of interest in the semantic aspect of information.
The intrinsic state corresponds to the semantic feature of the source, which in general is not observable.
The resulting rate-distortion function is the semantic rate-distortion function of the source.
arXiv Detail & Related papers (2022-01-29T02:14:24Z)
- A Deep Ordinal Distortion Estimation Approach for Distortion Rectification [62.72089758481803]
We propose a novel distortion rectification approach that can obtain more accurate parameters with higher efficiency.
We design a local-global associated estimation network that learns the ordinal distortion to approximate the realistic distortion distribution.
Given the redundancy of distortion information, our approach uses only part of the distorted image for ordinal distortion estimation.
arXiv Detail & Related papers (2020-07-21T10:03:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences of its use.