Density Ratio Estimation via Infinitesimal Classification
- URL: http://arxiv.org/abs/2111.11010v1
- Date: Mon, 22 Nov 2021 06:26:29 GMT
- Title: Density Ratio Estimation via Infinitesimal Classification
- Authors: Kristy Choi, Chenlin Meng, Yang Song, Stefano Ermon
- Abstract summary: We propose DRE-∞, a divide-and-conquer approach that reduces density ratio estimation (DRE) to a series of easier subproblems.
Inspired by Monte Carlo methods, we smoothly interpolate between the two distributions via an infinite continuum of intermediate bridge distributions.
We show that our approach performs well on downstream tasks such as mutual information estimation and energy-based modeling on complex, high-dimensional datasets.
- Score: 85.08255198145304
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Density ratio estimation (DRE) is a fundamental machine learning technique
for comparing two probability distributions. However, existing methods struggle
in high-dimensional settings, as it is difficult to accurately compare
probability distributions based on finite samples. In this work we propose
DRE-∞, a divide-and-conquer approach to reduce DRE to a series of easier
subproblems. Inspired by Monte Carlo methods, we smoothly interpolate between
the two distributions via an infinite continuum of intermediate bridge
distributions. We then estimate the instantaneous rate of change of the bridge
distributions indexed by time (the "time score") -- a quantity defined
analogously to data (Stein) scores -- with a novel time score matching
objective. Crucially, the learned time scores can then be integrated to compute
the desired density ratio. In addition, we show that traditional (Stein) scores
can be used to obtain integration paths that connect regions of high density in
both distributions, improving performance in practice. Empirically, we
demonstrate that our approach performs well on downstream tasks such as mutual
information estimation and energy-based modeling on complex, high-dimensional
datasets.
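The key identity, log p_1(x) - log p_0(x) = integral over t in [0, 1] of the time score d/dt log p_t(x), can be checked on a bridge whose time score is available in closed form. The sketch below is an illustrative toy, not the paper's method: the Gaussian bridge, its analytic time score, and all function names are assumptions made for the example, whereas the paper learns the time score with a neural network via time score matching.

```python
import numpy as np

# Gaussian bridge p_t = N(t*mu, 1) interpolating p_0 = N(0, 1) and
# p_1 = N(mu, 1); its time score is available analytically.

def time_score(x, t, mu):
    # d/dt log p_t(x) for the Gaussian bridge above
    return mu * (x - t * mu)

def log_density_ratio(x, mu, n_steps=101):
    # Integrate the time score over t in [0, 1] (trapezoidal rule)
    ts = np.linspace(0.0, 1.0, n_steps)
    vals = time_score(x, ts, mu)
    dt = ts[1] - ts[0]
    return float(np.sum(0.5 * (vals[:-1] + vals[1:]) * dt))

x, mu = 1.3, 2.0
est = log_density_ratio(x, mu)
exact = mu * x - mu ** 2 / 2  # closed-form log p_1(x) - log p_0(x)
```

For these two Gaussians the integrand is linear in t, so the numerically integrated time score matches the closed-form log density ratio exactly, which is precisely the mechanism DRE-∞ exploits with a learned time score.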
Related papers
- Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over the methods based on meta-analysis when heterogeneity increases.
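As background for the collaborative estimator, the sketch below shows the plain inverse-propensity-score (IPW) estimate of an average treatment effect that such methods build on. The synthetic data-generating process and the assumption of a known propensity are both illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(size=n)                    # covariate
e = 1.0 / (1.0 + np.exp(-x))              # propensity P(T=1 | x), known here
t = (rng.random(n) < e).astype(float)     # treatment assignment
y = 2.0 * t + x + rng.normal(size=n)      # outcome; true effect is 2.0

# IPW (Horvitz-Thompson) estimate of the average treatment effect:
# reweight each observed outcome by the inverse of its assignment probability
ate = np.mean(y * t / e) - np.mean(y * (1.0 - t) / (1.0 - e))
```

With the propensity known, the reweighted means are unbiased for the two potential-outcome means, so `ate` concentrates around the true effect of 2.0.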
arXiv Detail & Related papers (2024-04-24T09:04:36Z) - Estimating Joint Probability Distribution With Low-Rank Tensor Decomposition, Radon Transforms and Dictionaries [3.0892724364965005]
We describe a method for estimating the joint probability density from data samples by assuming that the underlying distribution can be decomposed as a mixture of product densities with few mixture components.
We combine two key ideas: dictionaries to represent 1-D densities, and random projections to estimate the joint distribution from 1-D marginals.
Our algorithm benefits from improved sample complexity over the previous dictionary-based approach by using 1-D marginals for reconstruction.
arXiv Detail & Related papers (2023-04-18T05:37:15Z) - Statistical Efficiency of Score Matching: The View from Isoperimetry [96.65637602827942]
We show a tight connection between statistical efficiency of score matching and the isoperimetric properties of the distribution being estimated.
We formalize these results in both the finite-sample regime and the asymptotic regime.
arXiv Detail & Related papers (2022-10-03T06:09:01Z) - Out-of-Distribution Detection with Class Ratio Estimation [4.930817402876787]
Density-based out-of-distribution (OOD) detection has recently been shown to be unreliable for detecting OOD images.
We propose to unify density-ratio-based methods under a novel framework that builds energy-based models and employs differing base distributions.
arXiv Detail & Related papers (2022-06-08T15:20:49Z) - A Unified Framework for Multi-distribution Density Ratio Estimation [101.67420298343512]
Binary density ratio estimation (DRE) provides the foundation for many state-of-the-art machine learning algorithms.
We develop a general framework from the perspective of Bregman divergence minimization.
We show that our framework leads to methods that strictly generalize their counterparts in binary DRE.
arXiv Detail & Related papers (2021-12-07T01:23:20Z) - Featurized Density Ratio Estimation [82.40706152910292]
In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation.
This featurization brings the densities closer together in latent space, sidestepping pathological scenarios where the learned density ratios in input space can be arbitrarily inaccurate.
At the same time, the invertibility of our feature map guarantees that the ratios computed in feature space are equivalent to those in input space.
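The invariance claim is a consequence of the change-of-variables formula: for an invertible feature map f with z = f(x), both densities pick up the same Jacobian factor, which cancels in the ratio. A short derivation (standard background, not quoted from the paper):

```latex
\frac{p_X(x)}{q_X(x)}
  = \frac{p_Z(f(x))\,\bigl|\det \tfrac{\partial f}{\partial x}\bigr|}
         {q_Z(f(x))\,\bigl|\det \tfrac{\partial f}{\partial x}\bigr|}
  = \frac{p_Z(f(x))}{q_Z(f(x))}.
```

This is why ratios estimated in the latent space of a normalizing flow transfer back to input space without correction.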
arXiv Detail & Related papers (2021-07-05T18:30:26Z) - Meta-Learning for Relative Density-Ratio Estimation [59.75321498170363]
Existing methods for (relative) density-ratio estimation (DRE) require many instances from both densities.
We propose a meta-learning method for relative DRE, which estimates the relative density-ratio from a few instances by using knowledge in related datasets.
We empirically demonstrate the effectiveness of the proposed method by using three problems: relative DRE, dataset comparison, and outlier detection.
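For context, the alpha-relative density ratio is a standard object in this literature (stated here as background, not quoted from the summary): it replaces p/q with a smoothed ratio against a mixture,

```latex
r_\alpha(x) \;=\; \frac{p(x)}{\alpha\, p(x) + (1-\alpha)\, q(x)},
\qquad 0 \le \alpha < 1.
```

Since the denominator is at least alpha * p(x), the relative ratio is bounded above by 1/alpha for alpha > 0, so it remains estimable even when the plain ratio p/q diverges.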
arXiv Detail & Related papers (2021-07-02T02:13:45Z) - Deep Generative Learning via Schrödinger Bridge [14.138796631423954]
We learn a generative model via entropy interpolation with a Schrödinger bridge.
We show that the generative model via the Schrödinger bridge is comparable with state-of-the-art GANs.
arXiv Detail & Related papers (2021-06-19T03:35:42Z) - DEMI: Discriminative Estimator of Mutual Information [5.248805627195347]
Estimating mutual information between continuous random variables is often intractable and challenging for high-dimensional data.
Recent progress has leveraged neural networks to optimize variational lower bounds on mutual information.
Our approach is based on training a classifier that provides the probability that a data sample pair is drawn from the joint distribution.
arXiv Detail & Related papers (2020-10-05T04:19:27Z)
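Classifier-based estimators such as DEMI rest on the standard density-ratio trick: a Bayes-optimal classifier between equal-sized samples from p and q outputs c(x) = p(x) / (p(x) + q(x)), so its logit equals the log density ratio. The toy check below uses analytic 1-D Gaussian densities in place of a trained classifier (an illustrative assumption, not the paper's estimator):

```python
import numpy as np

def normal_pdf(x, mu):
    # density of N(mu, 1)
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

mu = 1.5
x = np.linspace(-3.0, 3.0, 7)

p, q = normal_pdf(x, mu), normal_pdf(x, 0.0)
c = p / (p + q)                    # Bayes-optimal classifier probability
log_ratio = np.log(c / (1.0 - c))  # logit recovers log p(x) - log q(x)
exact = mu * x - mu ** 2 / 2       # closed form for these two Gaussians
```

Applied to pairs drawn from the joint versus the product of marginals, the same logit construction yields the pointwise mutual information, which is how a discriminative classifier turns into a mutual information estimator.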
This list is automatically generated from the titles and abstracts of the papers on this site.