Proof of the Gaussian maximizers conjecture for the communication
capacity of noisy heterodyne measurements
- URL: http://arxiv.org/abs/2206.02133v1
- Date: Sun, 5 Jun 2022 09:13:43 GMT
- Title: Proof of the Gaussian maximizers conjecture for the communication
capacity of noisy heterodyne measurements
- Authors: A. S. Holevo, S. N. Filippov
- Abstract summary: We provide a proof of a conjecture on the optimality of Gaussian encodings for the ultimate communication rate.
Results generalize previous ones and show a drastic difference in the structure of the optimal encoding.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Building on the convex programming framework recently developed in
the paper [arXiv:2204.10626], we provide a proof of a long-standing conjecture
on the optimality of Gaussian encodings for the ultimate communication rate of
generalized heterodyne receivers under the oscillator energy constraint. Our
results generalize previous ones (obtained under the assumption that the
energy threshold condition holds) and show a drastic difference in the
structure of the optimal encoding within and beyond this condition. The core
of the proof in the case beyond the threshold is a new log-Sobolev type
inequality, which relates the generalized Wehrl entropy to the wavefunction
gradient.
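For orientation, a schematic note on the standard objects named in the abstract: the definitions below are the textbook Husimi function and Wehrl entropy, and the displayed lower bound is Lieb's classical one, not the paper's new generalized log-Sobolev type inequality.

```latex
% Husimi function of a state \rho, with coherent states |z\rangle:
Q_\rho(z) = \frac{1}{\pi}\,\langle z|\rho|z\rangle ,
% Wehrl entropy of \rho:
\qquad W(\rho) = -\int_{\mathbb{C}} Q_\rho(z)\,\ln Q_\rho(z)\,\mathrm{d}^2 z ,
% Lieb's classical bound, with equality exactly for coherent states:
\qquad W(\rho) \ge 1 .
```

Per the abstract, the new inequality plays an analogous role beyond the threshold, lower-bounding a generalized Wehrl entropy in terms of the gradient of the wavefunction.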
Related papers
- Taming Nonconvex Stochastic Mirror Descent with General Bregman Divergence [25.717501580080846]
This paper revisits the convergence of stochastic mirror descent (SMD) in the contemporary nonconvex optimization setting.
We also develop provably convergent algorithms for the training of linear networks.
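As background only (this is the generic method named in the title, not the paper's algorithm), a minimal sketch of stochastic mirror descent with the negative-entropy mirror map on the probability simplex, for which the Bregman divergence is the KL divergence and the update is the exponentiated-gradient rule; the linear loss and noise model are hypothetical placeholders.

```python
import numpy as np

def smd_step_simplex(x, grad, eta):
    """One stochastic mirror descent step with the negative-entropy mirror
    map: solves argmin_y <grad, y> + KL(y, x) / eta over the simplex."""
    y = x * np.exp(-eta * grad)       # exponentiated-gradient update
    return y / y.sum()                # normalize back onto the simplex

# Hypothetical usage with a noisy linear loss <c, x>.
rng = np.random.default_rng(0)
x = np.ones(5) / 5                    # start at the uniform distribution
c = rng.normal(size=5)
for t in range(200):
    g = c + 0.1 * rng.normal(size=5)  # stochastic gradient estimate
    x = smd_step_simplex(x, g, eta=0.1)
print(x)                              # mass concentrates on argmin of c
```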
arXiv Detail & Related papers (2024-02-27T17:56:49Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
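For illustration only (a single-player optimization toy, not the games setting of the paper), a minimal sketch of Riemannian gradient descent on the unit sphere with a constant, curvature-agnostic step size; the quadratic objective and all parameters are hypothetical.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: geodesic from x in direction v."""
    n = np.linalg.norm(v)
    return x if n < 1e-12 else np.cos(n) * x + np.sin(n) * (v / n)

def riemannian_gd_step(x, egrad, eta):
    """Project the Euclidean gradient onto the tangent space at x,
    then move along the geodesic with a fixed step size."""
    rgrad = egrad - np.dot(egrad, x) * x   # tangent-space projection
    return sphere_exp(x, -eta * rgrad)

# Hypothetical usage: minimize f(x) = x^T A x over the unit sphere;
# the minimum value is the smallest eigenvalue of A.
rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
A = (A + A.T) / 2
x = rng.normal(size=4)
x /= np.linalg.norm(x)
for t in range(500):
    x = riemannian_gd_step(x, 2 * A @ x, eta=0.05)
print(x @ A @ x, np.linalg.eigvalsh(A)[0])  # the two should be close
```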
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Convex Bounds on the Softmax Function with Applications to Robustness Verification [69.09991317119679]
The softmax function is a ubiquitous component at the output of neural networks and increasingly in intermediate layers as well.
This paper provides convex lower bounds and concave upper bounds on the softmax function, which are compatible with convex optimization formulations for characterizing neural networks and other ML models.
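As a point of comparison (these are elementary monotonicity-based interval bounds, much looser than the convex/concave bounds the paper derives), each softmax coordinate can be bracketed by constants over a box of inputs; the box and test data here are hypothetical.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())           # shift for numerical stability
    return e / e.sum()

def softmax_interval_bounds(l, u, i):
    """Constant bounds on softmax_i(x) valid for all l <= x <= u, using
    softmax_i(x) = 1 / (1 + sum_{j != i} exp(x_j - x_i)), which is
    increasing in x_i and decreasing in every x_j with j != i."""
    others = [j for j in range(len(l)) if j != i]
    lo = 1.0 / (1.0 + np.exp(u[others] - l[i]).sum())  # worst case for i
    hi = 1.0 / (1.0 + np.exp(l[others] - u[i]).sum())  # best case for i
    return lo, hi

# Hypothetical sanity check on random points inside the box.
rng = np.random.default_rng(2)
l = rng.normal(size=4)
u = l + rng.uniform(0.1, 1.0, size=4)
lo, hi = softmax_interval_bounds(l, u, i=0)
for _ in range(1000):
    x = rng.uniform(l, u)
    assert lo <= softmax(x)[0] <= hi + 1e-12
```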
arXiv Detail & Related papers (2023-03-03T05:07:02Z)
- Large deviations rates for stochastic gradient descent with strongly convex functions [11.247580943940916]
We provide a formal framework for the study of general high-probability bounds for stochastic gradient descent (SGD).
We find an upper large-deviations bound for SGD with strongly convex functions.
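For intuition only (a toy experiment, not the paper's analysis), a minimal sketch of SGD on a strongly convex quadratic with additive gradient noise; the empirical tail probability of the last iterate's deviation from the minimizer decays rapidly in the deviation size, which is the regime large-deviations bounds describe. All constants are hypothetical.

```python
import numpy as np

def sgd_last_iterate(mu=1.0, eta=0.05, noise=1.0, steps=2000, seed=0):
    """SGD on the strongly convex f(x) = mu * x**2 / 2 with additive
    gradient noise; returns the last iterate."""
    rng = np.random.default_rng(seed)
    x = 5.0
    for _ in range(steps):
        g = mu * x + noise * rng.normal()  # stochastic gradient
        x -= eta * g
    return x

# Empirical tail estimate P(|x_T| >= t) over independent runs.
runs = np.array([sgd_last_iterate(seed=s) for s in range(2000)])
for t in (0.2, 0.4, 0.8):
    print(t, np.mean(np.abs(runs) >= t))   # decays quickly in t
```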
arXiv Detail & Related papers (2022-11-02T09:15:26Z)
- Log-Sobolev inequality and proof of Hypothesis of the Gaussian Maximizers for the capacity of quantum noisy homodyning [0.0]
We prove that the information-transmission capacity of the approximate position measurement with the oscillator energy constraint is attained on Gaussian encodings.
We expect this method to also work for other models lying outside the scope of the "threshold condition", which ensures that the upper bound for the capacity, given as the difference between the maximum and the minimum output entropies, is attainable.
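Schematically, the "threshold condition" concerns the attainability of the following entropy-difference upper bound; the notation ($\mathcal{M}$ for the measurement channel, $H$ for the output entropy, $H_{\mathrm{osc}}$ for the oscillator Hamiltonian, $E$ for the energy bound) is generic rather than the papers' own.

```latex
C(\mathcal{M}, E) \;\le\;
\sup_{\rho \,:\, \operatorname{Tr}\rho H_{\mathrm{osc}} \le E} H\big(\mathcal{M}[\rho]\big)
\;-\; \inf_{\rho} H\big(\mathcal{M}[\rho]\big)
```

The abstracts above refer to the regime where this bound is attainable as the threshold condition; beyond it, the log-Sobolev type inequality enters.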
arXiv Detail & Related papers (2022-04-22T10:43:22Z)
- Convex Analysis of the Mean Field Langevin Dynamics [49.66486092259375]
A convergence rate analysis of the mean field Langevin dynamics is presented.
The proximal Gibbs distribution $p_q$ associated with the dynamics allows us to develop a convergence theory parallel to classical results in convex optimization.
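For reference, a sketch of the proximal Gibbs distribution in the convention common in this literature (an entropy-regularized objective with strength $\lambda$ is assumed; the paper's exact normalization may differ):

```latex
p_q(x) \;\propto\; \exp\!\left( -\frac{1}{\lambda}\, \frac{\delta F}{\delta q}(x) \right)
```

where $\delta F / \delta q$ is the first variation of the objective $F$ at the current law $q$; the dynamics is stationary precisely when $q = p_q$.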
arXiv Detail & Related papers (2022-01-25T17:13:56Z)
- Optimizing Information-theoretical Generalization Bounds via Anisotropic Noise in SGLD [73.55632827932101]
We optimize the information-theoretical generalization bound by manipulating the noise structure in SGLD.
We prove that, under a constraint that guarantees low empirical risk, the optimal noise covariance is the square root of the expected gradient covariance.
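A minimal sketch of the quoted prescription (the model, data, and step sizes are hypothetical; scipy's matrix square root stands in for whatever construction the paper uses): inject noise whose covariance is the square root of an empirical gradient covariance.

```python
import numpy as np
from scipy.linalg import sqrtm

def sgld_aniso_step(w, grads, eta, rng):
    """One SGLD-style step where the injected noise covariance is the
    matrix square root of the empirical per-sample gradient covariance."""
    g = grads.mean(axis=0)
    cov = np.cov(grads, rowvar=False) + 1e-8 * np.eye(len(w))
    noise_cov = np.real(sqrtm(cov))        # Sigma = Cov^{1/2}
    xi = rng.multivariate_normal(np.zeros(len(w)), noise_cov)
    return w - eta * g + np.sqrt(2 * eta) * xi

# Hypothetical usage: per-sample gradients of squared loss for a linear model.
rng = np.random.default_rng(3)
X = rng.normal(size=(64, 5))
y = X @ rng.normal(size=5)
w = np.zeros(5)
for _ in range(200):
    grads = 2 * (X @ w - y)[:, None] * X   # per-sample gradients
    w = sgld_aniso_step(w, grads, eta=0.01, rng=rng)
```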
arXiv Detail & Related papers (2021-10-26T15:02:27Z)
- Interpolation can hurt robust generalization even when there is no noise [76.3492338989419]
We show that avoiding interpolation through ridge regularization can significantly improve generalization, even in the absence of noise.
We prove this phenomenon for the robust risk of both linear regression and classification and hence provide the first theoretical result on robust overfitting.
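To make "avoiding interpolation through ridge regularization" concrete, a minimal sketch with toy, hypothetical data: in the overparameterized regime (n < d), a vanishing ridge penalty approximates the minimum-norm interpolator, while larger penalties trade training fit for a smaller-norm solution.

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge solution w = (X^T X + lam * I)^{-1} X^T y;
    a tiny lam approximates the minimum-norm interpolator when n < d."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(4)
X = rng.normal(size=(20, 50))          # n < d: interpolation is possible
w_star = np.zeros(50)
w_star[:5] = 1.0                       # sparse ground truth
y = X @ w_star                         # noiseless labels
for lam in (1e-8, 1.0, 10.0):
    w = ridge(X, y, lam)
    print(lam, np.linalg.norm(w), np.linalg.norm(X @ w - y))
```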
arXiv Detail & Related papers (2021-08-05T23:04:15Z)
- Conformal field theory from lattice fermions [77.34726150561087]
We provide a rigorous lattice approximation of conformal field theories given in terms of lattice fermions in 1+1 dimensions.
We show how these results lead to explicit error estimates pertaining to the quantum simulation of conformal field theories.
arXiv Detail & Related papers (2021-07-29T08:54:07Z)
- On the classical capacity of quantum Gaussian measurement [0.0]
We prove Gaussianity of the average state of the optimal ensemble in general.
We discuss the Hypothesis of Gaussian Maximizers concerning the structure of the ensemble.
arXiv Detail & Related papers (2021-01-02T11:11:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.