Posterior Ratio Estimation of Latent Variables
- URL: http://arxiv.org/abs/2002.06410v2
- Date: Thu, 25 Jun 2020 17:49:50 GMT
- Title: Posterior Ratio Estimation of Latent Variables
- Authors: Song Liu, Yulong Zhang, Mingxuan Yi, Mladen Kolar
- Abstract summary: In some applications, we want to compare distributions of random variables that are inferred from observations.
We study the problem of estimating the ratio between two posterior probability density functions of a latent variable.
- Score: 14.619879849533662
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Density Ratio Estimation has attracted attention from the machine learning
community due to its ability to compare the underlying distributions of two
datasets. However, in some applications, we want to compare distributions of
random variables that are \emph{inferred} from observations. In this paper, we
study the problem of estimating the ratio between two posterior probability
density functions of a latent variable. Particularly, we assume the posterior
ratio function can be well-approximated by a parametric model, which is then
estimated using observed information and prior samples. We prove the
consistency of our estimator and the asymptotic normality of the estimated
parameters as the number of prior samples tends to infinity. Finally, we
validate our theories using numerical experiments and demonstrate the
usefulness of the proposed method through some real-world applications.
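As a concrete illustration of the setup, the sketch below fits a parametric model of the log posterior ratio log p(z|x1) - log p(z|x2) using only prior samples and the two likelihood functions, through a KLIEP-style surrogate objective. Everything in it (the Gaussian toy model, the linear-quadratic log-ratio parameterization, the specific objective) is an illustrative assumption, not the paper's estimator.

```python
# Illustrative sketch only: NOT the paper's estimator.
# Fit a parametric log posterior ratio log r(z) = log p(z|x1) - log p(z|x2)
# using samples from the prior and the two likelihood functions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy setup: standard normal prior over the latent z, unit-variance Gaussian
# likelihoods, and two observed datasets x1, x2 (all choices illustrative).
z_prior = rng.normal(size=2000)                    # samples from the prior p(z)
x1 = rng.normal(loc=1.0, scale=1.0, size=50)       # observations behind posterior 1
x2 = rng.normal(loc=-0.5, scale=1.0, size=50)      # observations behind posterior 2

def log_lik(x, z):
    """Log-likelihood log p(x | z) under a unit-variance Gaussian model."""
    return -0.5 * np.sum((x[None, :] - z[:, None]) ** 2, axis=1)

# Self-normalized importance weights proportional to each posterior,
# evaluated at the prior samples (posterior ∝ likelihood × prior).
ll1, ll2 = log_lik(x1, z_prior), log_lik(x2, z_prior)
w1 = np.exp(ll1 - ll1.max()); w1 /= w1.sum()
w2 = np.exp(ll2 - ll2.max()); w2 /= w2.sum()

def log_ratio(theta, z):
    """Parametric model for log r(z): linear-quadratic in z."""
    return theta[0] + theta[1] * z + theta[2] * z ** 2

def objective(theta):
    # KLIEP-style surrogate: maximize E_{p(z|x1)}[log r_theta(z)]
    # while penalizing log E_{p(z|x2)}[r_theta(z)], with both expectations
    # approximated by importance sampling from the prior.
    lr = log_ratio(theta, z_prior)
    return -(np.sum(w1 * lr) - np.log(np.sum(w2 * np.exp(lr))))

theta_hat = minimize(objective, x0=np.zeros(3), method="L-BFGS-B").x
print("estimated log-ratio coefficients:", theta_hat)
```

The importance weights computed from prior samples play the role of the "observed information and prior samples" mentioned in the abstract; the surrogate objective and ratio model here are stand-ins chosen for brevity.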
Related papers
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and in particular do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z) - Double Debiased Covariate Shift Adaptation Robust to Density-Ratio Estimation [7.8856737627153874]
We propose a doubly robust estimator for covariate shift adaptation via importance weighting.
Our estimator reduces the bias arising from the density ratio estimation errors.
Notably, our estimator remains consistent if either the density ratio estimator or the regression function is consistent.
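The robustness property stated here is the usual double-robustness of AIPW-style constructions. For reference, in generic notation (not necessarily the paper's exact estimator), the target-population mean outcome under covariate shift can be estimated by combining a density-ratio (importance weight) model $\hat r$ with a regression model $\hat f$:

$$
\hat\theta_{\mathrm{DR}} \;=\; \frac{1}{n_{\mathrm{te}}}\sum_{i=1}^{n_{\mathrm{te}}} \hat f\bigl(x_i^{\mathrm{te}}\bigr) \;+\; \frac{1}{n_{\mathrm{tr}}}\sum_{j=1}^{n_{\mathrm{tr}}} \hat r\bigl(x_j^{\mathrm{tr}}\bigr)\,\bigl(y_j^{\mathrm{tr}} - \hat f\bigl(x_j^{\mathrm{tr}}\bigr)\bigr),
$$

which stays consistent if either $\hat r$ or $\hat f$ is consistent.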
arXiv Detail & Related papers (2023-10-25T13:38:29Z) - Asymptotics of Bayesian Uncertainty Estimation in Random Features
Regression [1.170951597793276]
We focus on the variance of the posterior predictive distribution (Bayesian model average) and compare it to the risk of the MAP estimator.
The two quantities agree with each other when the number of samples grows faster than any constant multiple of the model dimension.
arXiv Detail & Related papers (2023-06-06T15:36:15Z) - Anomaly Detection with Variance Stabilized Density Estimation [49.46356430493534]
We present a variance-stabilized density estimation problem for maximizing the likelihood of the observed samples.
To obtain a reliable anomaly detector, we introduce a spectral ensemble of autoregressive models for learning the variance-stabilized distribution.
We have conducted an extensive benchmark with 52 datasets, demonstrating that our method leads to state-of-the-art results.
arXiv Detail & Related papers (2023-06-01T11:52:58Z) - Bayesian Hierarchical Models for Counterfactual Estimation [12.159830463756341]
We propose a probabilistic paradigm to estimate a diverse set of counterfactuals.
We treat the perturbations as random variables endowed with prior distribution functions.
A gradient-based sampler with superior convergence characteristics efficiently computes the posterior samples.
arXiv Detail & Related papers (2023-01-21T00:21:11Z) - Statistical Efficiency of Score Matching: The View from Isoperimetry [96.65637602827942]
We show a tight connection between statistical efficiency of score matching and the isoperimetric properties of the distribution being estimated.
We formalize these results both in the finite-sample regime and in the asymptotic regime.
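For context, the score matching objective whose statistical efficiency is being analyzed is the standard Hyvärinen objective (generic notation, reproduced here for reference):

$$
J(\theta) \;=\; \tfrac{1}{2}\,\mathbb{E}_{x\sim p}\bigl[\lVert \nabla_x \log p_\theta(x) - \nabla_x \log p(x)\rVert^2\bigr]
\;=\; \mathbb{E}_{x\sim p}\Bigl[\operatorname{tr}\bigl(\nabla_x^2 \log p_\theta(x)\bigr) + \tfrac{1}{2}\lVert \nabla_x \log p_\theta(x)\rVert^2\Bigr] + \mathrm{const},
$$

where the second expression, obtained by integration by parts, is the form minimized in practice because it does not require the unknown data score.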
arXiv Detail & Related papers (2022-10-03T06:09:01Z) - Comparing two samples through stochastic dominance: a graphical approach [2.867517731896504]
Non-deterministic measurements are common in real-world scenarios.
We propose an alternative framework to visually compare two samples according to their estimated cumulative distribution functions.
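A minimal sketch of this kind of comparison (illustrative code, not the paper's tooling): plot the two empirical CDFs and check whether one lies entirely below the other, which is the graphical signature of first-order stochastic dominance.

```python
# Illustrative sketch: compare two samples via their empirical CDFs.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, size=300)   # sample A (toy data)
b = rng.normal(0.3, 1.2, size=300)   # sample B (toy data)

def ecdf(sample):
    """Return sorted values and empirical CDF heights for a sample."""
    x = np.sort(sample)
    y = np.arange(1, len(x) + 1) / len(x)
    return x, y

for name, s in [("A", a), ("B", b)]:
    x, y = ecdf(s)
    plt.step(x, y, where="post", label=f"sample {name}")

# If F_A lies entirely below F_B, A first-order stochastically dominates B;
# crossings indicate no dominance in either direction.
plt.xlabel("value"); plt.ylabel("empirical CDF"); plt.legend(); plt.show()
```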
arXiv Detail & Related papers (2022-03-15T13:37:03Z) - Estimating Divergences in High Dimensions [6.172809837529207]
We propose the use of decomposable models for estimating divergences in high dimensional data.
These allow us to factorize the estimated density of the high-dimensional distribution into a product of lower dimensional functions.
We show empirically that estimating the Kullback-Leibler divergence using decomposable models from a maximum likelihood estimator outperforms existing methods for divergence estimation.
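The factorization referred to here is the standard one for decomposable (junction-tree) models; writing it out makes clear why only low-dimensional estimates are needed (a generic identity, not necessarily the paper's exact estimator):

$$
\hat p(x) \;=\; \frac{\prod_{C\in\mathcal{C}} \hat p_C(x_C)}{\prod_{S\in\mathcal{S}} \hat p_S(x_S)},
\qquad
\widehat{D}_{\mathrm{KL}}(p\,\|\,q) \;=\; \frac{1}{n}\sum_{i=1}^{n}\log\frac{\hat p(x_i)}{\hat q(x_i)},
$$

where $\mathcal{C}$ and $\mathcal{S}$ range over the cliques and separators of the decomposable graph, so every factor depends on only a small subset of variables.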
arXiv Detail & Related papers (2021-12-08T20:37:28Z) - Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
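In generic notation, the regularized-regression view can be sketched as follows (an assumed form of the framework, not a verbatim reproduction of the paper): since $\mathbb{E}_p[\lVert s(x)-\nabla\log p(x)\rVert^2] = \mathbb{E}_p[\lVert s(x)\rVert^2 + 2\,\nabla\!\cdot s(x)] + \mathrm{const}$ by integration by parts, a kernelized score estimator solves

$$
\hat s \;=\; \operatorname*{arg\,min}_{s\in\mathcal{H}}\; \frac{1}{n}\sum_{i=1}^{n}\Bigl[\lVert s(x_i)\rVert^2 + 2\,\nabla\!\cdot s(x_i)\Bigr] \;+\; \lambda\,\lVert s\rVert_{\mathcal{H}}^2,
$$

with $\mathcal{H}$ a reproducing kernel Hilbert space of vector fields and $\lambda$ the regularization strength.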
arXiv Detail & Related papers (2020-05-20T15:01:03Z) - Estimating Gradients for Discrete Random Variables by Sampling without
Replacement [93.09326095997336]
We derive an unbiased estimator for expectations over discrete random variables based on sampling without replacement.
We show that our estimator can be derived as the Rao-Blackwellization of three different estimators.
arXiv Detail & Related papers (2020-02-14T14:15:18Z) - Fast approximations in the homogeneous Ising model for use in scene
analysis [61.0951285821105]
We provide accurate approximations that make it possible to numerically calculate quantities needed in inference.
We show that our approximation formulae are scalable and unfazed by the size of the Markov Random Field.
The practical import of our approximation formulae is illustrated in performing Bayesian inference in a functional Magnetic Resonance Imaging activation detection experiment, and also in likelihood ratio testing for anisotropy in the spatial patterns of yearly increases in pistachio tree yields.
arXiv Detail & Related papers (2017-12-06T14:24:34Z)