A Dimensionality Reduction Method for Finding Least Favorable Priors
with a Focus on Bregman Divergence
- URL: http://arxiv.org/abs/2202.11598v1
- Date: Wed, 23 Feb 2022 16:22:28 GMT
- Title: A Dimensionality Reduction Method for Finding Least Favorable Priors
with a Focus on Bregman Divergence
- Authors: Alex Dytso, Mario Goldenbaum, H. Vincent Poor, Shlomo Shamai (Shitz)
- Abstract summary: This paper develops a dimensionality reduction method that allows us to move the optimization to a finite-dimensional setting with an explicit bound on the dimension.
In order to make progress on the problem, we restrict ourselves to Bayesian risks induced by a relatively large class of loss functions, namely Bregman divergences.
- Score: 108.28566246421742
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A common way of characterizing minimax estimators in point estimation is by
moving the problem into the Bayesian estimation domain and finding a least
favorable prior distribution. The Bayesian estimator induced by a least
favorable prior, under mild conditions, is then known to be minimax. However,
finding least favorable distributions can be challenging due to the inherent
optimization over the space of probability distributions, which is
infinite-dimensional. This paper develops a dimensionality reduction method
that allows us to move the optimization to a finite-dimensional setting with an
explicit bound on the dimension. The benefit of this dimensionality reduction
is that it permits the use of popular algorithms such as projected gradient
ascent to find least favorable priors. Throughout the paper, in order to make
progress on the problem, we restrict ourselves to Bayesian risks induced by a
relatively large class of loss functions, namely Bregman divergences.
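The approach outlined in the abstract can be sketched numerically. The following is a minimal illustration (an assumption-laden sketch, not the paper's actual algorithm): take the bounded normal-mean model X ~ N(theta, 1) with theta in [-1, 1] and squared-error loss, which is the Bregman divergence generated by F(x) = x^2. Restricting the prior to a finite support grid, in the spirit of the dimensionality reduction, lets projected gradient ascent search for a least favorable prior. By the envelope theorem, the gradient of the Bayes risk in the prior weights is the frequentist risk of the current Bayes estimator at each support point. The grid sizes, step size, and iteration count below are arbitrary choices.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    tau = (css[rho] - 1) / (rho + 1.0)
    return np.maximum(v - tau, 0.0)

# Discretized setting: theta in [-1, 1], X ~ N(theta, 1), squared-error loss
# (the Bregman divergence generated by F(x) = x^2).
theta = np.linspace(-1.0, 1.0, 41)   # finite support grid for the prior
x = np.linspace(-6.0, 6.0, 601)      # integration grid for the observation
dx = x[1] - x[0]
lik = np.exp(-0.5 * (x[:, None] - theta[None, :]) ** 2) / np.sqrt(2 * np.pi)

pi = np.full(theta.size, 1.0 / theta.size)  # start from the uniform prior
for _ in range(500):
    marg = lik @ pi                                   # marginal density of X
    post_mean = (lik * (pi * theta)).sum(1) / marg    # Bayes estimator E[theta|X]
    # Frequentist risk of the current Bayes estimator at each support point;
    # by the envelope theorem this is the gradient of the Bayes risk in pi.
    risk = ((post_mean[:, None] - theta[None, :]) ** 2 * lik).sum(0) * dx
    pi = project_simplex(pi + 0.5 * risk)             # projected ascent step

bayes_risk = float(risk @ pi)
```

For this interval the least favorable prior is known to concentrate at the endpoints, so after convergence the iterate should place most of its mass near theta = ±1, and `bayes_risk` approximates the minimax risk.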
Related papers
- Sharp detection of low-dimensional structure in probability measures via dimensional logarithmic Sobolev inequalities [0.5592394503914488]
We introduce a method for identifying and approximating a target measure $\pi$ as a perturbation of a given reference measure $\mu$.
Our main contribution unveils a connection between the dimensional logarithmic Sobolev inequality (LSI) and approximations with this ansatz.
arXiv Detail & Related papers (2024-06-18T20:02:44Z)
- Distributed Least Squares in Small Space via Sketching and Bias Reduction [0.0]
Matrix sketching is a powerful tool for reducing the size of large data matrices.
We show that these limitations can be circumvented in the distributed setting by designing sketching methods that minimize the bias of the estimator, rather than its error.
In particular, we give a sparse sketching method running in optimal space and current matrix multiplication time, which recovers a nearly-unbiased least squares estimator using two passes over the data.
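As a generic illustration of the sketch-and-solve idea (a simple, biased baseline, not the two-pass bias-reduction method this paper proposes), a Gaussian sketch can compress a tall least-squares system before solving; the matrix sizes and noise level below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 2000, 20, 200          # tall data matrix, sketch size m << n
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

# Gaussian sketch: S has i.i.d. N(0, 1/m) entries, so E[S^T S] = I_n.
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sketch = np.linalg.lstsq(S @ A, S @ b, rcond=None)[0]
x_exact = np.linalg.lstsq(A, b, rcond=None)[0]

rel_err = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
```

Solving the m-row sketched system costs far less than the full n-row problem, at the price of the bias and extra variance that the debiasing methods above aim to remove.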
arXiv Detail & Related papers (2024-05-08T18:16:37Z)
- Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
Building on this, we obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients.
arXiv Detail & Related papers (2024-04-16T20:01:10Z)
- Riemannian stochastic optimization methods avoid strict saddle points [68.80251170757647]
We show that policies under study avoid strict saddle points / submanifolds with probability 1.
This result provides an important sanity check as it shows that, almost always, the limit state of an algorithm can only be a local minimizer.
arXiv Detail & Related papers (2023-11-04T11:12:24Z)
- Conditional score-based diffusion models for Bayesian inference in infinite dimensions [4.747324197963405]
We propose a theoretically grounded method for sampling from the posterior of infinite-dimensional inverse problems based on amortized conditional SDMs.
A significant part of our analysis is dedicated to demonstrating that extending infinite-dimensional SDMs to the conditional setting requires careful consideration.
arXiv Detail & Related papers (2023-05-28T15:34:15Z)
- Improving the Accuracy of Marginal Approximations in Likelihood-Free Inference via Localisation [0.0]
A promising approach to high-dimensional likelihood-free inference involves estimating low-dimensional marginal posteriors.
We show that such low-dimensional approximations can be surprisingly poor in practice for seemingly intuitive summary statistic choices.
We suggest an alternative approach to marginal estimation which is easier to implement and automate.
arXiv Detail & Related papers (2022-07-14T04:56:44Z)
- Distributed Sketching for Randomized Optimization: Exact Characterization, Concentration and Lower Bounds [54.51566432934556]
We consider distributed optimization methods for problems where forming the Hessian is computationally challenging.
We leverage randomized sketches for reducing the problem dimensions as well as preserving privacy and improving straggler resilience in asynchronous distributed systems.
arXiv Detail & Related papers (2022-03-18T05:49:13Z)
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT).
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.