Independent finite approximations for Bayesian nonparametric inference
- URL: http://arxiv.org/abs/2009.10780v4
- Date: Sun, 5 Nov 2023 17:00:48 GMT
- Title: Independent finite approximations for Bayesian nonparametric inference
- Authors: Tin D. Nguyen, Jonathan Huggins, Lorenzo Masoero, Lester Mackey,
Tamara Broderick
- Abstract summary: We propose a recipe to construct practical finite-dimensional approximations for homogeneous random measures.
We upper bound the approximation error of AIFAs for a wide class of common CRMs and NCRMs.
We prove that, for worst-case choices of observation likelihoods, TFAs are more efficient than AIFAs.
- Score: 30.367795444044788
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Completely random measures (CRMs) and their normalizations (NCRMs) offer
flexible models in Bayesian nonparametrics. But their infinite dimensionality
presents challenges for inference. Two popular finite approximations are
truncated finite approximations (TFAs) and independent finite approximations
(IFAs). While the former have been well-studied, IFAs lack similarly general
bounds on approximation error, and there has been no systematic comparison
between the two options. In the present work, we propose a general recipe to
construct practical finite-dimensional approximations for homogeneous CRMs and
NCRMs, in the presence or absence of power laws. We call our construction the
automated independent finite approximation (AIFA). Relative to TFAs, we show
that AIFAs facilitate more straightforward derivations and use of parallel
computing in approximate inference. We upper bound the approximation error of
AIFAs for a wide class of common CRMs and NCRMs -- and thereby develop
guidelines for choosing the approximation level. Our lower bounds in key cases
suggest that our upper bounds are tight. We prove that, for worst-case choices
of observation likelihoods, TFAs are more efficient than AIFAs. Conversely, we
find that in real-data experiments with standard likelihoods, AIFAs and TFAs
perform similarly. Moreover, we demonstrate that AIFAs can be used for
hyperparameter estimation even when other potential IFA options struggle or do
not apply.
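As a concrete (if simplified) illustration of the two kinds of finite approximation discussed above, the sketch below contrasts a truncated stick-breaking construction (TFA-style) with a normalization of independent random rates (IFA-style) for a Dirichlet process. The concentration alpha and approximation level K are arbitrary illustrative values, and this is a generic textbook construction rather than the paper's AIFA recipe.

```python
import numpy as np

rng = np.random.default_rng(0)


def tfa_dirichlet_weights(alpha, K, rng):
    """TFA-style: the first K stick-breaking weights of a Dirichlet process
    with concentration alpha; the leftover tail mass is simply discarded,
    so the truncated weights sum to less than one."""
    betas = rng.beta(1.0, alpha, size=K)
    sticks_left = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    return betas * sticks_left


def ifa_dirichlet_weights(alpha, K, rng):
    """IFA-style: K independent Gamma(alpha/K, 1) rates, normalized. This is
    a draw from a symmetric Dirichlet(alpha/K, ..., alpha/K) and converges
    to the Dirichlet process as K grows."""
    rates = rng.gamma(alpha / K, 1.0, size=K)
    return rates / rates.sum()


alpha, K = 2.0, 50
print("TFA mass kept:", tfa_dirichlet_weights(alpha, K, rng).sum())  # < 1
print("IFA mass kept:", ifa_dirichlet_weights(alpha, K, rng).sum())  # exactly 1
```

Note how the IFA-style weights are built from independent components that can be simulated (and updated) in parallel, whereas the stick-breaking weights are sequentially coupled.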
Related papers
- A Correlation-induced Finite Difference Estimator [6.054123928890574]
We first provide a sample-driven method based on the bootstrap to estimate the optimal perturbation, and then propose an efficient finite-difference (FD) estimator that uses correlated samples at the estimated optimal perturbation.
Numerical results confirm the efficiency of our estimators and align well with the theory presented, especially in scenarios with small sample sizes.
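For intuition only, here is a minimal sketch of a central finite-difference estimator that reuses the same random draws on both sides of the perturbation (common random numbers), which is the variance-reduction idea behind correlated-sample FD estimators; the toy simulator f, the perturbation h, and the sample size are placeholders, not the paper's bootstrap-chosen quantities.

```python
import numpy as np


def fd_gradient(f, theta, h, n, rng):
    """Central finite-difference estimate of d/dtheta E[f(theta, U)] that
    evaluates both theta + h and theta - h on the SAME uniforms U (common
    random numbers), so the difference has much lower variance than with
    independent draws at each point."""
    u = rng.random(n)
    return np.mean((f(theta + h, u) - f(theta - h, u)) / (2.0 * h))


# Toy problem: E[f(theta, U)] = theta**2 / 2, so the derivative at 1.5 is 1.5.
rng = np.random.default_rng(1)
f = lambda theta, u: theta ** 2 * u
print(fd_gradient(f, theta=1.5, h=1e-3, n=10_000, rng=rng))
```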
arXiv Detail & Related papers (2024-05-09T09:27:18Z) - Latent Feature Relation Consistency for Adversarial Robustness [80.24334635105829]
Misclassification occurs when deep neural networks are given adversarial examples, which add human-imperceptible adversarial noise to natural examples.
We propose Latent Feature Relation Consistency (LFRC).
LFRC constrains the relation of adversarial examples in latent space to be consistent with the natural examples.
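A rough sketch of the relation-consistency idea (not the paper's exact loss): compute pairwise similarity matrices over the latent features of natural and adversarial batches, and penalize the discrepancy between them. The feature dimensions and penalty form here are illustrative assumptions.

```python
import numpy as np


def relation_matrix(feats):
    """Pairwise cosine similarities between latent features in a batch."""
    normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return normed @ normed.T


def relation_consistency_loss(nat_feats, adv_feats):
    """Penalize the gap between the relational structure of adversarial and
    natural latent features (a generic stand-in for an LFRC-style term)."""
    diff = relation_matrix(adv_feats) - relation_matrix(nat_feats)
    return np.mean(diff ** 2)


rng = np.random.default_rng(0)
nat = rng.normal(size=(8, 16))               # latent features of natural inputs
adv = nat + 0.1 * rng.normal(size=(8, 16))   # features of perturbed inputs
print(relation_consistency_loss(nat, adv))
```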
arXiv Detail & Related papers (2023-03-29T13:50:01Z) - Stable Probability Weighting: Large-Sample and Finite-Sample Estimation
and Inference Methods for Heterogeneous Causal Effects of Multivalued
Treatments Under Limited Overlap [0.0]
I propose new practical large-sample and finite-sample methods for estimating heterogeneous causal effects and conducting inference on them.
I develop a general principle called "Stable Probability Weighting."
I also propose new finite-sample inference methods for testing a general class of weak null hypotheses.
arXiv Detail & Related papers (2023-01-13T18:52:18Z) - Asymptotically Unbiased Instance-wise Regularized Partial AUC
Optimization: Theory and Algorithm [101.44676036551537]
One-way Partial AUC (OPAUC) and Two-way Partial AUC (TPAUC) measure the average performance of a binary classifier within restricted ranges of the ROC curve.
Most existing methods can only optimize the PAUC approximately, leading to inevitable and uncontrollable biases.
We present a simpler reformulation of the PAUC problem via distributionally robust optimization.
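As background, a one-way partial AUC simply restricts the false-positive-rate range of the ordinary AUC. The snippet below only evaluates the metric on synthetic data using scikit-learn's max_fpr option (which reports the standardized partial AUC); it does not implement the paper's DRO-based optimization.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)                 # binary labels
scores = y + rng.normal(scale=1.0, size=1000)     # noisy classifier scores

print("full AUC:  ", roc_auc_score(y, scores))
# One-way partial AUC over FPR in [0, 0.1], standardized by scikit-learn.
print("OPAUC(0.1):", roc_auc_score(y, scores, max_fpr=0.1))
```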
arXiv Detail & Related papers (2022-10-08T08:26:22Z) - A New Central Limit Theorem for the Augmented IPW Estimator: Variance
Inflation, Cross-Fit Covariance and Beyond [0.9172870611255595]
Augmented inverse probability weighting (AIPW) with cross-fitting is a popular choice in practice.
We study this cross-fit AIPW estimator under well-specified outcome regression and propensity score models in a high-dimensional regime.
Our work utilizes a novel interplay between three distinct tools--approximate message passing theory, the theory of deterministic equivalents, and the leave-one-out approach.
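For reference, a minimal cross-fit AIPW sketch for the average treatment effect on synthetic data, with off-the-shelf logistic and linear regressions as nuisance models; this illustrates the standard estimator being analyzed, not the paper's high-dimensional theory, and the models, clipping threshold, and fold count are illustrative choices.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import KFold


def cross_fit_aipw(X, A, Y, n_folds=2, seed=0):
    """Cross-fit AIPW estimate of E[Y(1)] - E[Y(0)]: nuisances are fit on one
    fold, evaluated on the held-out fold, and plugged into the doubly robust
    score."""
    psi = np.zeros(len(Y))
    for train, test in KFold(n_folds, shuffle=True, random_state=seed).split(X):
        e = LogisticRegression().fit(X[train], A[train])
        m1 = LinearRegression().fit(X[train][A[train] == 1], Y[train][A[train] == 1])
        m0 = LinearRegression().fit(X[train][A[train] == 0], Y[train][A[train] == 0])
        e_hat = np.clip(e.predict_proba(X[test])[:, 1], 1e-3, 1 - 1e-3)
        m1_hat, m0_hat = m1.predict(X[test]), m0.predict(X[test])
        psi[test] = (m1_hat + A[test] * (Y[test] - m1_hat) / e_hat
                     - m0_hat - (1 - A[test]) * (Y[test] - m0_hat) / (1 - e_hat))
    return psi.mean()


rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))
A = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))      # confounded treatment
Y = 2.0 * A + X @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=2000)
print(cross_fit_aipw(X, A, Y))                        # roughly 2.0
```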
arXiv Detail & Related papers (2022-05-20T14:17:53Z) - Learning to Estimate Without Bias [57.82628598276623]
The Gauss-Markov theorem states that the weighted least squares estimator is the linear minimum variance unbiased estimator (MVUE) in linear models.
In this paper, we take a first step towards extending this result to nonlinear settings via deep learning with bias constraints.
A second motivation for the resulting bias-constrained estimator (BCE) is in applications where multiple estimates of the same unknown are averaged for improved performance.
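For context, the classical weighted least squares baseline referenced above can be computed directly; the sketch below uses synthetic heteroscedastic data and inverse-variance weights, illustrating the linear MVUE that the bias-constrained deep estimators aim to extend.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta_true = 500, np.array([1.0, -2.0])
X = rng.normal(size=(n, 2))
noise_var = 0.1 + rng.random(n)                 # heteroscedastic noise variances
y = X @ beta_true + rng.normal(size=n) * np.sqrt(noise_var)

# Weighted least squares with inverse-variance weights:
# beta_hat = (X^T W X)^{-1} X^T W y, the linear MVUE in this model.
W = np.diag(1.0 / noise_var)
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta_hat)                                  # close to [1.0, -2.0]
```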
arXiv Detail & Related papers (2021-10-24T10:23:51Z) - Fine-tuning is Fine in Federated Learning [3.222802562733787]
We study the performance of federated learning algorithms and their variants in a multi-criterion framework.
This multi-criterion approach naturally models the high-dimensional, many-tuned nature of federated learning.
arXiv Detail & Related papers (2021-08-16T18:59:24Z) - Federated Deep AUC Maximization for Heterogeneous Data with a Constant
Communication Complexity [77.78624443410216]
We propose improved FDAM (federated deep AUC maximization) algorithms for heterogeneous data.
A notable result is that the communication complexity of the proposed algorithm is independent of the number of machines and of the accuracy level.
Experiments have demonstrated the effectiveness of our FDAM algorithm on benchmark datasets and on medical chest Xray images from different organizations.
arXiv Detail & Related papers (2021-02-09T04:05:19Z) - $\gamma$-ABC: Outlier-Robust Approximate Bayesian Computation Based on a
Robust Divergence Estimator [95.71091446753414]
We propose to use a nearest-neighbor-based $\gamma$-divergence estimator as a data discrepancy measure.
Our method achieves significantly higher robustness than existing discrepancy measures.
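Below is a minimal generic ABC rejection sampler with the data discrepancy left as a pluggable function: the paper's contribution is to use a nearest-neighbor $\gamma$-divergence estimator there, whereas this sketch plugs in a simple mean-difference placeholder on a toy Gaussian model.

```python
import numpy as np


def abc_rejection(observed, prior_sampler, simulator, discrepancy, eps, n, rng):
    """Generic ABC: keep prior draws whose simulated data lie within eps of the
    observations under the chosen discrepancy. A robust divergence estimator
    would be supplied as `discrepancy`; here we use a simple placeholder."""
    accepted = []
    for _ in range(n):
        theta = prior_sampler(rng)
        fake = simulator(theta, rng)
        if discrepancy(observed, fake) < eps:
            accepted.append(theta)
    return np.array(accepted)


# Toy model: data ~ Normal(theta, 1); discrepancy = |difference of sample means|.
rng = np.random.default_rng(0)
obs = rng.normal(2.0, 1.0, size=200)
posterior = abc_rejection(
    observed=obs,
    prior_sampler=lambda r: r.uniform(-5, 5),
    simulator=lambda th, r: r.normal(th, 1.0, size=200),
    discrepancy=lambda a, b: abs(a.mean() - b.mean()),
    eps=0.1, n=5000, rng=rng,
)
print(posterior.mean(), posterior.size)          # posterior mean near 2.0
```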
arXiv Detail & Related papers (2020-06-13T06:09:27Z) - Lower bounds in multiple testing: A framework based on derandomized
proxies [107.69746750639584]
This paper introduces an analysis strategy based on derandomization, illustrated by applications to various concrete models.
We provide numerical simulations of some of these lower bounds, and show a close relation to the actual performance of the Benjamini-Hochberg (BH) algorithm.
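Since the summary references the Benjamini-Hochberg algorithm, here is a short self-contained sketch of the BH step-up procedure on synthetic p-values (the mixture of nulls and signals is an arbitrary illustration).

```python
import numpy as np


def benjamini_hochberg(pvals, q=0.1):
    """Benjamini-Hochberg step-up procedure: reject the k smallest p-values,
    where k is the largest index with p_(k) <= k * q / m."""
    m = len(pvals)
    order = np.argsort(pvals)
    sorted_p = np.asarray(pvals)[order]
    below = sorted_p <= q * np.arange(1, m + 1) / m
    if not below.any():
        return np.zeros(m, dtype=bool)
    k = np.max(np.nonzero(below)[0]) + 1         # largest index passing the bound
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True
    return rejected


rng = np.random.default_rng(0)
# 900 true nulls (uniform p-values) and 100 signals (small p-values).
pvals = np.concatenate([rng.random(900), rng.beta(0.1, 10.0, size=100)])
print(benjamini_hochberg(pvals, q=0.1).sum(), "rejections")
```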
arXiv Detail & Related papers (2020-05-07T19:59:51Z) - Deterministic Approximate EM Algorithm; Application to the Riemann
Approximation EM and the Tempered EM [0.0]
We introduce a theoretical framework, with state-of-the-art convergence guarantees, for any deterministic approximation of the E step.
We analyse theoretically and empirically several approximations that fit into this framework, for intractable E-steps.
We showcase how new, previously unstudied profiles can more successfully escape adversarial initialisations.
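To make the setting concrete, below is a standard EM loop for a two-component 1-D Gaussian mixture; the E-step computes exact responsibilities, and the framework in the paper concerns replacing that step with a deterministic approximation (e.g., Riemann sums or tempering) when it is intractable. The toy data and initialisation are illustrative.

```python
import numpy as np
from scipy.stats import norm


def em_two_gaussians(x, n_iter=50):
    """Standard EM for a two-component 1-D Gaussian mixture. The E-step below
    is exact; a deterministic approximation of it would be substituted here
    when the posterior responsibilities cannot be computed in closed form."""
    mu = np.array([x.min(), x.max()])
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibilities of each component for each point.
        dens = pi * norm.pdf(x[:, None], mu, sigma)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: closed-form updates given the responsibilities.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma


rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
print(em_two_gaussians(x))   # weights ~ [0.6, 0.4], means ~ [-2, 3]
```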
arXiv Detail & Related papers (2020-03-23T08:23:54Z)