Minimizing robust density power-based divergences for general parametric density models
- URL: http://arxiv.org/abs/2307.05251v4
- Date: Thu, 8 Feb 2024 10:57:13 GMT
- Title: Minimizing robust density power-based divergences for general parametric density models
- Authors: Akifumi Okuno
- Abstract summary: We introduce a stochastic approach to minimize the density power divergence (DPD) for general parametric density models.
The proposed approach can also be employed to minimize other density power-based $\gamma$-divergences.
- Score: 3.0277213703725767
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Density power divergence (DPD) is designed to robustly estimate the
underlying distribution of observations, in the presence of outliers. However,
DPD involves an integral of the power of the parametric density models to be
estimated; the explicit form of the integral term can be derived only for
specific densities, such as normal and exponential densities. While we may
perform a numerical integration for each iteration of the optimization
algorithms, the computational complexity has hindered the practical application
of DPD-based estimation to more general parametric densities. To address the
issue, this study introduces a stochastic approach to minimize DPD for general
parametric density models. The proposed approach can also be employed to
minimize other density power-based $\gamma$-divergences, by leveraging
unnormalized models. We provide an \verb|R| package implementing the
proposed approach at \url{https://github.com/oknakfm/sgdpd}.
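The core idea in the abstract, replacing the intractable integral $\int f_\theta(x)^{1+\beta}\,dx$ in the DPD objective with a stochastic estimate so that the loss can be minimized by gradient descent, can be illustrated with a short sketch. The snippet below is not the authors' \verb|sgdpd| package (which is written in R); it is a minimal JAX illustration under assumed choices: a normal location-scale model, a fixed Cauchy importance-sampling proposal, toy outlier-contaminated data, and an arbitrary step size.

```python
# Minimal sketch (assumptions only, not the authors' sgdpd implementation):
# minimize the empirical DPD loss
#   \int f_theta(x)^{1+beta} dx - (1 + 1/beta) * mean_i f_theta(x_i)^beta
# by gradient descent, estimating the integral by importance sampling
# so that no closed-form integral of the model density is required.
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm, cauchy

beta = 0.5  # DPD power parameter; beta -> 0 recovers maximum likelihood

def log_density(theta, x):
    # Parametric model f_theta: normal with mean theta[0] and scale exp(theta[1]).
    return norm.logpdf(x, loc=theta[0], scale=jnp.exp(theta[1]))

def dpd_loss(theta, data, z):
    # Importance-sampling estimate of \int f_theta(x)^{1+beta} dx with z ~ Cauchy(0, 5).
    log_w = (1.0 + beta) * log_density(theta, z) - cauchy.logpdf(z, loc=0.0, scale=5.0)
    integral_hat = jnp.mean(jnp.exp(log_w))
    # Empirical DPD loss (up to a theta-independent constant).
    return integral_hat - (1.0 + 1.0 / beta) * jnp.mean(jnp.exp(beta * log_density(theta, data)))

grad_fn = jax.jit(jax.grad(dpd_loss))

key = jax.random.PRNGKey(0)
key, k_clean = jax.random.split(key)
# Toy data: N(1, 1) samples contaminated with 10% outliers at x = 10.
data = jnp.concatenate([jax.random.normal(k_clean, (900,)) + 1.0, jnp.full((100,), 10.0)])

theta = jnp.array([0.0, 0.0])
for _ in range(2000):
    key, k_z = jax.random.split(key)
    z = 5.0 * jax.random.cauchy(k_z, (512,))  # fresh proposal draws each iteration
    theta = theta - 0.05 * grad_fn(theta, data, z)

print("robust estimates (mean, scale):", theta[0], jnp.exp(theta[1]))
```

Because the outliers receive weight $f_\theta(x_i)^\beta \approx 0$, the fitted mean and scale should stay close to the clean component; the heavy-tailed proposal keeps the importance weights bounded for this toy setup.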
Related papers
- Inflationary Flows: Calibrated Bayesian Inference with Diffusion-Based Models [0.0]
We show how diffusion-based models can be repurposed for performing principled, identifiable Bayesian inference.
We show how such maps can be learned via standard DBM training using a novel noise schedule.
The result is a class of highly expressive generative models, uniquely defined on a low-dimensional latent space.
arXiv Detail & Related papers (2024-07-11T19:58:19Z)
- Density Estimation via Binless Multidimensional Integration [45.21975243399607]
We introduce the Binless Multidimensional Thermodynamic Integration (BMTI) method for nonparametric, robust, and data-efficient density estimation.
BMTI estimates the logarithm of the density by initially computing log-density differences between neighbouring data points.
The method is tested on a variety of complex synthetic high-dimensional datasets, and is benchmarked on realistic datasets from the chemical physics literature.
arXiv Detail & Related papers (2024-07-10T23:45:20Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- Sobolev Space Regularised Pre Density Models [51.558848491038916]
We propose a new approach to non-parametric density estimation that is based on regularizing a Sobolev norm of the density.
This method is statistically consistent, and makes the inductive validation model clear and consistent.
arXiv Detail & Related papers (2023-07-25T18:47:53Z)
- Matching Normalizing Flows and Probability Paths on Manifolds [57.95251557443005]
Continuous Normalizing Flows (CNFs) are generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
We propose to train CNFs by minimizing probability path divergence (PPD), a novel family of divergences between the probability density path generated by the CNF and a target probability density path.
We show that CNFs learned by minimizing PPD achieve state-of-the-art results in likelihoods and sample quality on existing low-dimensional manifold benchmarks.
arXiv Detail & Related papers (2022-07-11T08:50:19Z)
- Strong posterior contraction rates via Wasserstein dynamics [8.479040075763892]
In Bayesian statistics, posterior contraction rates (PCRs) quantify the speed at which the posterior distribution concentrates on arbitrarily small neighborhoods of a true model.
We develop a new approach to PCRs, with respect to strong norm distances on parameter spaces of functions.
arXiv Detail & Related papers (2022-03-21T06:53:35Z)
- A Model for Multi-View Residual Covariances based on Perspective Deformation [88.21738020902411]
We derive a model for the covariance of the visual residuals in multi-view SfM, odometry and SLAM setups.
We validate our model with synthetic and real data and integrate it into photometric and feature-based Bundle Adjustment.
arXiv Detail & Related papers (2022-02-01T21:21:56Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- A likelihood approach to nonparametric estimation of a singular distribution using deep generative models [4.329951775163721]
We investigate a likelihood approach to nonparametric estimation of a singular distribution using deep generative models.
We prove that a novel and effective solution exists by perturbing the data with an instance noise.
We also characterize the class of distributions that can be efficiently estimated via deep generative models.
arXiv Detail & Related papers (2021-05-09T23:13:58Z)
- Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization [94.18714844247766]
Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
arXiv Detail & Related papers (2021-02-02T21:01:13Z)
- Conditional Density Estimation via Weighted Logistic Regressions [0.30458514384586394]
We propose a novel parametric conditional density estimation method by showing the connection between the general density and the likelihood function of inhomogeneous process models.
The maximum likelihood estimates can be obtained via weighted logistic regressions, and the computation can be significantly relaxed by combining a block-wise alternating scheme and local case-control sampling.
arXiv Detail & Related papers (2020-10-21T11:08:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.