Disentangling Learning Representations with Density Estimation
- URL: http://arxiv.org/abs/2302.04362v1
- Date: Wed, 8 Feb 2023 22:37:33 GMT
- Title: Disentangling Learning Representations with Density Estimation
- Authors: Eric Yeats, Frank Liu, Hai Li
- Abstract summary: We present a method which achieves reliable disentanglement via flexible density estimation of the latent space.
GCAE achieves highly competitive and reliable disentanglement scores compared with state-of-the-art baselines.
- Score: 9.244163477446799
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Disentangled learning representations have promising utility in many
applications, but they currently suffer from serious reliability issues. We
present Gaussian Channel Autoencoder (GCAE), a method which achieves reliable
disentanglement via flexible density estimation of the latent space. GCAE
avoids the curse of dimensionality of density estimation by disentangling
subsets of its latent space with the Dual Total Correlation (DTC) metric,
thereby representing its high-dimensional latent joint distribution as a
collection of many low-dimensional conditional distributions. In our
experiments, GCAE achieves highly competitive and reliable disentanglement
scores compared with state-of-the-art baselines.
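The Dual Total Correlation referenced in the abstract is not defined there; for reference, its standard information-theoretic form (a sketch, not taken from the paper) is

```latex
\mathrm{DTC}(\mathbf{z}) \;=\; H(\mathbf{z}) \;-\; \sum_{i=1}^{d} H\!\left(z_i \,\middle|\, \mathbf{z}_{\setminus i}\right)
```

where $H$ is entropy and $\mathbf{z}_{\setminus i}$ denotes all latent dimensions except $z_i$. DTC vanishes exactly when the latents are mutually independent, which is why minimizing it encourages disentanglement, and each conditional term is low-dimensional, matching the abstract's description of the joint as a collection of conditional distributions.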
Related papers
- Your copula is a classifier in disguise: classification-based copula density estimation [2.5261465733373965]
We propose reinterpreting copula density estimation as a discriminative task.
We derive equivalences between well-known copula classes and classification problems naturally arising in our interpretation.
We show our estimator achieves theoretical guarantees akin to maximum likelihood estimation.
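The general idea behind classification-based density estimation can be sketched with the standard density-ratio trick: a probabilistic classifier trained to separate samples of $p$ from samples of $q$ recovers $\log p(x)/q(x)$ as its logit. This is a minimal toy illustration (not the paper's copula construction); the Gaussian parameters and training loop are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: estimate r(x) = p(x)/q(x) by training a classifier to
# separate samples of p (label 1) from samples of q (label 0); then
# log r(x) is approximated by the classifier logit.
xp = rng.normal(0.5, 1.0, size=(4000, 1))  # samples from p = N(0.5, 1)
xq = rng.normal(0.0, 1.0, size=(4000, 1))  # samples from q = N(0.0, 1)

X = np.vstack([xp, xq])
y = np.concatenate([np.ones(len(xp)), np.zeros(len(xq))])

# Minimal logistic regression with a bias column, fit by gradient descent.
Xb = np.hstack([X, np.ones((len(X), 1))])
w = np.zeros(2)
for _ in range(2000):
    c = 1.0 / (1.0 + np.exp(-Xb @ w))          # classifier probability
    w -= 0.1 * Xb.T @ (c - y) / len(y)         # mean log-loss gradient

def log_ratio(x):
    """Estimated log p(x)/q(x): the logit of the fitted classifier."""
    return np.array([x, 1.0]) @ w

# For these two Gaussians the true log-ratio is linear: 0.5*x - 0.125,
# so the fitted logit should be close to that line.
```

For equal-variance Gaussians the log-ratio really is linear, so a linear classifier is well-specified; the copula reinterpretation in the paper applies the same discriminative principle to copula densities.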
arXiv Detail & Related papers (2024-11-05T11:25:34Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Density Ratio Estimation via Infinitesimal Classification [85.08255198145304]
We propose DRE-infty, a divide-and-conquer approach that reduces density ratio estimation (DRE) to a series of easier subproblems.
Inspired by Monte Carlo methods, we smoothly interpolate between the two distributions via an infinite continuum of intermediate bridge distributions.
We show that our approach performs well on downstream tasks such as mutual information estimation and energy-based modeling on complex, high-dimensional datasets.
arXiv Detail & Related papers (2021-11-22T06:26:29Z) - Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z) - Featurized Density Ratio Estimation [82.40706152910292]
In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation.
This featurization brings the densities closer together in latent space, sidestepping pathological scenarios where the learned density ratios in input space can be arbitrarily inaccurate.
At the same time, the invertibility of our feature map guarantees that the ratios computed in feature space are equivalent to those in input space.
arXiv Detail & Related papers (2021-07-05T18:30:26Z) - Imitation with Neural Density Models [98.34503611309256]
We propose a new framework for Imitation Learning (IL) via density estimation of the expert's occupancy measure followed by Imitation Occupancy Entropy Reinforcement Learning (RL) using the density as a reward.
Our approach maximizes a non-adversarial model-free RL objective that provably lower bounds reverse Kullback-Leibler divergence between occupancy measures of the expert and imitator.
arXiv Detail & Related papers (2020-10-19T19:38:36Z) - Nonparametric Density Estimation from Markov Chains [68.8204255655161]
We introduce a new nonparametric density estimator that is inspired by Markov chains and generalizes the well-known Kernel Density Estimator.
Our estimator offers several benefits over the usual estimators and can be used straightforwardly as a foundation for density-based algorithms.
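The Kernel Density Estimator that this paper generalizes can be sketched in a few lines; this is the textbook Gaussian-kernel form, with bandwidth and data chosen arbitrarily for illustration.

```python
import numpy as np

def gaussian_kde(samples, h):
    """Classic kernel density estimator with a Gaussian kernel.

    Returns a function x -> (1/(n*h)) * sum_i K((x - x_i)/h), the
    standard 1-D KDE with bandwidth h.
    """
    samples = np.asarray(samples, dtype=float)
    norm = 1.0 / (len(samples) * h * np.sqrt(2.0 * np.pi))

    def density(x):
        u = (x - samples) / h
        return norm * np.exp(-0.5 * u * u).sum()

    return density

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=5000)
pdf = gaussian_kde(data, h=0.3)
# At the mode of N(0, 1) the true density is 1/sqrt(2*pi) ~ 0.399;
# the KDE value there should land nearby (slightly smoothed downward).
```

The bandwidth `h` controls the bias-variance trade-off: the KDE targets the data density convolved with the kernel, which is what Markov-chain-based generalizations such as the one above aim to improve on.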
arXiv Detail & Related papers (2020-09-08T18:33:42Z) - Telescoping Density-Ratio Estimation [21.514983459970903]
We introduce a new framework, telescoping density-ratio estimation (TRE).
TRE enables the estimation of ratios between highly dissimilar densities in high-dimensional spaces.
Our experiments demonstrate that TRE can yield substantial improvements over existing single-ratio methods.
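The telescoping identity behind TRE can be checked on a toy example with analytically known bridges. In TRE each bridge ratio is learned by a classifier; here, as an assumption for illustration, the bridges are equal-variance Gaussians with linearly interpolated means, so every ratio is available in closed form.

```python
import numpy as np

def log_gauss(x, mean, std):
    """Log-density of N(mean, std^2) at x."""
    return -0.5 * ((x - mean) / std) ** 2 - np.log(std * np.sqrt(2.0 * np.pi))

# Two widely separated densities whose direct ratio is hard to estimate
# from samples. Toy bridges: p_k = N(m_k, 1) with interpolated means.
m0, m1, K = 0.0, 8.0, 10
means = np.linspace(m0, m1, K + 1)

x = 3.0
# Telescoping: log p1/p0 = sum_k [log p_{k+1}(x) - log p_k(x)],
# a chain of K "easy" ratios between neighboring bridge densities.
telescoped = sum(
    log_gauss(x, means[k + 1], 1.0) - log_gauss(x, means[k], 1.0)
    for k in range(K)
)
direct = log_gauss(x, m1, 1.0) - log_gauss(x, m0, 1.0)
# The chained ratios recover the single hard ratio exactly.
```

The identity is exact here because the bridge densities are known; TRE's contribution is that each neighboring pair overlaps enough for a classifier to estimate its ratio reliably, which the direct pair does not.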
arXiv Detail & Related papers (2020-06-22T12:55:06Z) - High-Dimensional Non-Parametric Density Estimation in Mixed Smooth Sobolev Spaces [31.663702435594825]
Density estimation plays a key role in many tasks in machine learning, statistical inference, and visualization.
The main bottlenecks in high-dimensional density estimation are the prohibitive computational cost and the slow convergence rate.
We propose novel estimators for high-dimensional non-parametric density estimation called the adaptive hyperbolic cross density estimators.
arXiv Detail & Related papers (2020-06-05T21:27:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.