High-dimensional density estimation with tensorizing flow
- URL: http://arxiv.org/abs/2212.00759v1
- Date: Thu, 1 Dec 2022 18:45:45 GMT
- Title: High-dimensional density estimation with tensorizing flow
- Authors: Yinuo Ren, Hongli Zhao, Yuehaw Khoo, Lexing Ying
- Abstract summary: We propose the tensorizing flow method for estimating high-dimensional probability density functions from the observed data.
The proposed method combines the optimization-less feature of the tensor-train with the flexibility of the flow-based generative models.
- Score: 5.457842083043014
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose the tensorizing flow method for estimating high-dimensional
probability density functions from the observed data. The method is based on
tensor-train and flow-based generative modeling. Our method first efficiently
constructs an approximate density in the tensor-train form via solving the
tensor cores from a linear system based on the kernel density estimators of
low-dimensional marginals. We then train a continuous-time flow model from this
tensor-train density to the observed empirical distribution by performing a
maximum likelihood estimation. The proposed method combines the
optimization-less feature of the tensor-train with the flexibility of the
flow-based generative models. Numerical results are included to demonstrate the
performance of the proposed method.
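To make the first stage of this two-step pipeline concrete, below is a minimal NumPy sketch of evaluating a density stored in discretized tensor-train form. It is an illustration under simplifying assumptions, not the authors' code: the construction of the cores from a linear system built on kernel density estimates of low-dimensional marginals, and the continuous-time flow trained by maximum likelihood on top of this base density, are both omitted, and all function and variable names are invented for the example.
```python
# Minimal NumPy sketch: evaluating a density stored in (discretized)
# tensor-train (TT) format. Illustrative only -- the paper builds the cores
# by solving a linear system derived from kernel density estimates of
# low-dimensional marginals, and then corrects the TT base density with a
# continuous-time flow trained by maximum likelihood; both steps are omitted.
import numpy as np


def tt_eval(cores, grids, x):
    """Evaluate p_TT(x) = G_1[x_1] G_2[x_2] ... G_d[x_d].

    cores : list of d arrays, cores[k] has shape (r_{k-1}, n_k, r_k),
            with boundary ranks r_0 = r_d = 1
    grids : list of d 1-D arrays holding the n_k grid nodes of dimension k
    x     : point of dimension d (snapped to the nearest grid node here)
    """
    v = np.ones(1)                                   # boundary rank r_0 = 1
    for k, (G, g) in enumerate(zip(cores, grids)):
        i = int(np.argmin(np.abs(g - x[k])))         # nearest-node lookup
        v = v @ G[:, i, :]                           # contract the k-th core
    return float(v[0])                               # boundary rank r_d = 1


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n, r = 4, 16, 3                               # dimension, grid size, TT rank
    ranks = [1] + [r] * (d - 1) + [1]
    grids = [np.linspace(-3.0, 3.0, n) for _ in range(d)]
    cores = [rng.random((ranks[k], n, ranks[k + 1])) for k in range(d)]
    print("unnormalized TT value at the origin:", tt_eval(cores, grids, np.zeros(d)))
```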
Related papers
- Diffusion Density Estimators [0.0]
We introduce a new, highly parallelizable method that computes log densities without the need to solve a flow.
Our approach is based on estimating a path integral by Monte Carlo, in a manner identical to the simulation-free training of diffusion models.
arXiv Detail & Related papers (2024-10-09T15:21:53Z)
- TERM Model: Tensor Ring Mixture Model for Density Estimation [48.622060998018206]
In this paper, we adopt a tensor ring decomposition as the density estimator, which significantly reduces the number of permutation candidates.
A mixture model that incorporates multiple permutation candidates with adaptive weights is further designed, resulting in increased expressive flexibility.
This approach acknowledges that suboptimal permutations can offer distinctive information besides that of optimal permutations.
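For concreteness, a tensor ring closes the chain of cores with a trace, so the first and last ranks coincide and cyclic reorderings of the dimensions become equivalent. The hedged sketch below only evaluates a single entry of such a decomposition; the paper's mixture over permutation candidates with adaptive weights is not reproduced, and all names and shapes are illustrative.
```python
# Minimal NumPy sketch of a tensor-ring evaluation: unlike a tensor train,
# the chain of cores is closed by a trace, so the first and last ranks match.
# Illustrative only; the mixture over permutation candidates is not shown.
import numpy as np


def tr_eval(cores, indices):
    """Evaluate Tr(G_1[i_1] G_2[i_2] ... G_d[i_d]) for a tensor ring.

    cores[k] has shape (r, n_k, r); indices[k] selects the slice in dim k.
    """
    r = cores[0].shape[0]
    M = np.eye(r)
    for G, i in zip(cores, indices):
        M = M @ G[:, i, :]                 # chain the core slices
    return float(np.trace(M))              # closing the ring


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d, n, r = 5, 8, 3
    cores = [rng.random((r, n, r)) for _ in range(d)]
    print("tensor-ring entry:", tr_eval(cores, [0] * d))
```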
arXiv Detail & Related papers (2023-12-13T11:39:56Z)
- Generative Modeling via Hierarchical Tensor Sketching [12.005736675688917]
We propose a hierarchical tensor-network approach for approximating a high-dimensional probability density from an empirical distribution.
The complexity of the resulting algorithm scales linearly in the dimension of the high-dimensional density.
arXiv Detail & Related papers (2023-04-11T15:55:13Z)
- Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis of the approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
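As a point of reference for the "naive density" mentioned above, the short sketch below contrasts the eps-ball indicator count used by DBSCAN-style algorithms with a smooth Gaussian-kernel estimate. The adaptive kernel diffusion density proposed in the paper is not implemented here, and the helper names are made up for the example.
```python
# Minimal NumPy sketch contrasting the "naive" density used by many
# density-based clustering algorithms (count of points inside a radius-eps
# Euclidean ball) with a smooth Gaussian-kernel density. The paper's
# adaptive kernel diffusion density is not reproduced here.
import numpy as np


def ball_density(X, x, eps):
    """Number of data points inside the eps-ball around x (DBSCAN-style)."""
    return int(np.sum(np.linalg.norm(X - x, axis=1) <= eps))


def gaussian_kernel_density(X, x, h):
    """Unnormalized Gaussian-kernel density estimate at x with bandwidth h."""
    sq = np.sum((X - x) ** 2, axis=1)
    return float(np.mean(np.exp(-sq / (2.0 * h ** 2))))


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    X = rng.normal(size=(500, 2))
    x = np.zeros(2)
    print("ball count:", ball_density(X, x, eps=0.5))
    print("kernel density (unnormalized):", gaussian_kernel_density(X, x, h=0.5))
```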
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- Tensor-Train Density Estimation [16.414910030716555]
We propose a new, efficient tensor-train-based model for density estimation (TTDE).
Such a density parametrization allows exact sampling and the calculation of cumulative and marginal density functions as well as the partition function.
We show that TTDE significantly outperforms competitors in training speed.
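To illustrate why tensor-train parametrizations give cheap partition functions (and, by the same contraction pattern, marginals and CDFs), here is a hedged NumPy sketch for a discretized TT density. It is not TTDE's actual parametrization; the grid discretization and uniform quadrature weights are simplifying assumptions made only for this example.
```python
# Minimal NumPy sketch of why tensor-train densities admit cheap partition
# functions: summing a discretized TT density over all grid points reduces to
# contracting each core with a vector of quadrature weights, one dimension at
# a time. Marginal and cumulative evaluations follow the same pattern.
import numpy as np


def tt_partition(cores, weights):
    """Z = sum over all grid indices of w_1[i1]...w_d[id] * G_1[i1]...G_d[id]."""
    v = np.ones(1)                              # boundary rank r_0 = 1
    for G, w in zip(cores, weights):
        v = v @ np.einsum("inj,n->ij", G, w)    # sum the core over its grid
    return float(v[0])


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    d, n, r = 6, 10, 4
    ranks = [1] + [r] * (d - 1) + [1]
    cores = [rng.random((ranks[k], n, ranks[k + 1])) for k in range(d)]
    weights = [np.full(n, 1.0 / n) for _ in range(d)]   # uniform quadrature
    print("partition function Z ~", tt_partition(cores, weights))
```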
arXiv Detail & Related papers (2021-07-30T21:51:12Z)
- Low-rank Characteristic Tensor Density Estimation Part I: Foundations [38.05393186002834]
We propose a novel approach that builds upon tensor factorization tools.
To circumvent the curse of dimensionality, we introduce a low-rank model of the characteristic tensor, i.e., the multivariate characteristic function evaluated on a grid.
We demonstrate the very promising performance of the proposed method using several measured datasets.
arXiv Detail & Related papers (2020-08-27T18:06:19Z)
- Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)
- Learning Generative Models using Denoising Density Estimators [29.068491722778827]
We introduce a new generative model based on denoising density estimators (DDEs).
Our main contribution is a novel technique to obtain generative models by minimizing the KL-divergence directly.
Experimental results demonstrate substantial improvement in density estimation and competitive performance in generative model training.
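As a toy illustration of what "minimizing the KL-divergence directly" means as an objective, the sketch below fits a univariate Gaussian to a fixed Gaussian target by gradient descent on the closed-form KL. It says nothing about denoising density estimators themselves, which are not reproduced; all quantities are chosen purely for the example.
```python
# Toy NumPy illustration of fitting a model by minimizing a KL divergence
# directly (here in closed form between two univariate Gaussians). This only
# makes the objective concrete; the paper's DDE method is not reproduced.
import numpy as np

mu_p, sigma_p = 2.0, 0.5                      # fixed target density p
mu_q, log_sigma_q = 0.0, 0.0                  # model parameters of q
lr = 0.1

for step in range(200):
    sigma_q = np.exp(log_sigma_q)
    # d/d mu_q        KL(q || p) = (mu_q - mu_p) / sigma_p**2
    g_mu = (mu_q - mu_p) / sigma_p ** 2
    # d/d log_sigma_q KL(q || p) = sigma_q**2 / sigma_p**2 - 1
    g_ls = sigma_q ** 2 / sigma_p ** 2 - 1.0
    mu_q -= lr * g_mu
    log_sigma_q -= lr * g_ls

print("fitted mean, std:", mu_q, np.exp(log_sigma_q))   # converges to ~ (2.0, 0.5)
```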
arXiv Detail & Related papers (2020-01-08T20:30:40Z)
- A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from the Fokker-Planck equation.
Compared with existing schemes, Wasserstein gradient flow is a smoother and near-optimal numerical scheme to approximate real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)