On the Computation of the Gaussian Rate-Distortion-Perception Function
- URL: http://arxiv.org/abs/2311.09190v1
- Date: Wed, 15 Nov 2023 18:34:03 GMT
- Title: On the Computation of the Gaussian Rate-Distortion-Perception Function
- Authors: Giuseppe Serra, Photios A. Stavrou, and Marios Kountouris
- Abstract summary: We study the computation of the rate-distortion-perception function (RDPF) for a multivariate Gaussian source under mean squared error (MSE) distortion.
We provide the associated algorithmic realization, together with a characterization of its convergence and rate of convergence.
We corroborate our results with numerical simulations and draw connections to existing results.
- Score: 10.564071872770146
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we study the computation of the rate-distortion-perception
function (RDPF) for a multivariate Gaussian source under mean squared error
(MSE) distortion and, respectively, Kullback-Leibler divergence, geometric
Jensen-Shannon divergence, squared Hellinger distance, and squared
Wasserstein-2 distance perception metrics. To this end, we first characterize
the analytical bounds of the scalar Gaussian RDPF for the aforementioned
divergence functions, also providing the RDPF-achieving forward "test-channel"
realization. Focusing on the multivariate case, we establish that, for
tensorizable distortion and perception metrics, the optimal solution resides on
the vector space spanned by the eigenvectors of the source covariance matrix.
Consequently, the multivariate optimization problem can be expressed as a
function of the scalar Gaussian RDPFs of the source marginals, constrained by
global distortion and perception levels. Leveraging this characterization, we
design an alternating minimization scheme based on the block nonlinear
Gauss-Seidel method, which optimally solves the problem while identifying the
Gaussian RDPF-achieving realization. Furthermore, the associated algorithmic
embodiment is provided, as well as the convergence and the rate of convergence
characterization. Lastly, for the "perfect realism" regime, the analytical
solution for the multivariate Gaussian RDPF is obtained. We corroborate our
results with numerical simulations and draw connections to existing results.
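To make the reduction described in the abstract concrete, the following is a minimal sketch of the decomposed problem, assuming an N-dimensional Gaussian source whose covariance has eigenvalues \lambda_1, ..., \lambda_N; the symbols R_i, D_i, P_i and the perception-aggregation map \phi are illustrative notation introduced here, not taken from the paper.

```latex
% Hedged sketch: the multivariate Gaussian RDPF written as an allocation of
% per-component distortion and perception budgets across the eigen-components.
% R_i(.,.;\lambda_i) denotes the scalar Gaussian RDPF of the i-th marginal;
% the aggregation map \phi depends on the chosen tensorizable perception metric.
\begin{equation}
  R(D, P) \;=\;
  \min_{\{(D_i, P_i)\}_{i=1}^{N}}
  \sum_{i=1}^{N} R_i(D_i, P_i; \lambda_i)
  \quad \text{s.t.} \quad
  \sum_{i=1}^{N} D_i \le D,
  \qquad
  \phi(P_1, \dots, P_N) \le P .
\end{equation}
```

An alternating minimization of the block nonlinear Gauss-Seidel type mentioned in the abstract can then cycle between blocks of variables. The snippet below is a generic two-block sketch over the distortion and perception allocations; the callable scalar_rdpf, the additive perception budget, and the use of a general-purpose constrained solver for each block update are assumptions made here for illustration, not the paper's construction.

```python
# Hedged sketch of a two-block nonlinear Gauss-Seidel loop for the reduced
# problem: alternately re-optimize the distortion allocation {D_i} and the
# perception allocation {P_i} while holding the other block fixed.
# `scalar_rdpf(d, p, var)` is a user-supplied scalar Gaussian RDPF; the
# additive perception budget and the SLSQP block solves are illustrative.
import numpy as np
from scipy.optimize import minimize


def block_gauss_seidel_rdpf(scalar_rdpf, variances, D, P,
                            n_iters=200, tol=1e-9):
    n = len(variances)
    D_alloc = np.full(n, D / n)   # start from uniform allocations
    P_alloc = np.full(n, P / n)

    def total_rate(d_vec, p_vec):
        return sum(scalar_rdpf(d, p, v)
                   for d, p, v in zip(d_vec, p_vec, variances))

    prev = cur = total_rate(D_alloc, P_alloc)
    for _ in range(n_iters):
        # Block 1: distortion allocation, perception allocation held fixed.
        res_d = minimize(
            lambda d: total_rate(d, P_alloc), D_alloc,
            bounds=[(1e-12, None)] * n,
            constraints=[{"type": "ineq", "fun": lambda d: D - d.sum()}],
        )
        D_alloc = res_d.x

        # Block 2: perception allocation, distortion allocation held fixed.
        res_p = minimize(
            lambda p: total_rate(D_alloc, p), P_alloc,
            bounds=[(1e-12, None)] * n,
            constraints=[{"type": "ineq", "fun": lambda p: P - p.sum()}],
        )
        P_alloc = res_p.x

        cur = total_rate(D_alloc, P_alloc)
        if abs(prev - cur) < tol:   # stop once the objective stabilizes
            break
        prev = cur
    return D_alloc, P_alloc, cur
```

Each block update is delegated to a generic constrained solver purely to keep the sketch short; it is not the update rule analyzed in the paper.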
Related papers
- Variance-Reducing Couplings for Random Features [57.73648780299374]
Random features (RFs) are a popular technique to scale up kernel methods in machine learning.
We find couplings to improve RFs defined on both Euclidean and discrete input spaces.
We reach surprising conclusions about the benefits and limitations of variance reduction as a paradigm.
arXiv Detail & Related papers (2024-05-26T12:25:09Z)
- Analytical Approximation of the ELBO Gradient in the Context of the Clutter Problem [0.0]
We propose an analytical solution for approximating the gradient of the Evidence Lower Bound (ELBO) in variational inference problems.
The proposed method demonstrates good accuracy and rate of convergence together with linear computational complexity.
arXiv Detail & Related papers (2024-04-16T13:19:46Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Rate-Distortion-Perception Tradeoff Based on the Conditional-Distribution Perception Measure [33.084834042565895]
We study the rate-distortion-perception (RDP) tradeoff for a memoryless source model in the limit of large blocklengths.
Our perception measure is based on a divergence between the distributions of the source and reconstruction sequences conditioned on the encoder output.
arXiv Detail & Related papers (2024-01-22T18:49:56Z)
- Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution for the particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Stochastic Mirror Descent for Large-Scale Sparse Recovery [13.500750042707407]
We discuss an application of stochastic approximation to the statistical estimation of high-dimensional sparse parameters.
We show that the proposed algorithm attains the optimal convergence of the estimation error under weak assumptions on the regressor distribution.
arXiv Detail & Related papers (2022-10-23T23:23:23Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Understanding Implicit Regularization in Over-Parameterized Single Index Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z)
- Semiparametric Nonlinear Bipartite Graph Representation Learning with Provable Guarantees [106.91654068632882]
We consider the bipartite graph and formalize its representation learning problem as a statistical estimation problem of parameters in a semiparametric exponential family distribution.
We show that the proposed objective is strongly convex in a neighborhood around the ground truth, so that a gradient descent-based method achieves linear convergence rate.
Our estimator is robust to any model misspecification within the exponential family, which is validated in extensive experiments.
arXiv Detail & Related papers (2020-03-02T16:40:36Z)
- Inverses of Matérn Covariances on Grids [0.0]
We study the properties of a popular approximation based on partial differential equations on a regular grid of points.
We find that it assigns too much power at high frequencies and does not provide increasingly accurate approximations to the inverse as the grid spacing goes to zero.
In a simulation study, we investigate the implications for parameter estimation, finding that the SPDE approximation tends to overestimate spatial range parameters.
arXiv Detail & Related papers (2019-12-26T18:36:06Z)