Learning High-Dimensional Distributions with Latent Neural Fokker-Planck Kernels
- URL: http://arxiv.org/abs/2105.04538v1
- Date: Mon, 10 May 2021 17:42:01 GMT
- Title: Learning High-Dimensional Distributions with Latent Neural Fokker-Planck Kernels
- Authors: Yufan Zhou, Changyou Chen, Jinhui Xu
- Abstract summary: We introduce new techniques to formulate the problem as solving a Fokker-Planck equation in a lower-dimensional latent space.
Our proposed model consists of latent-distribution morphing, a generator, and a parameterized Fokker-Planck kernel function.
- Score: 67.81799703916563
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning high-dimensional distributions is an important yet challenging
problem in machine learning, with applications in various domains. In this
paper, we introduce new techniques that formulate the problem as solving a
Fokker-Planck equation in a lower-dimensional latent space, aiming to mitigate
the challenges posed by the high-dimensional data space. Our proposed model
consists of latent-distribution morphing, a generator, and a parameterized
Fokker-Planck kernel function. One fascinating property of our model is that
it can be trained with an arbitrary number of latent-distribution-morphing
steps, or even without morphing, which makes it flexible and as efficient as
Generative Adversarial Networks (GANs). This property also makes our
latent-distribution morphing an efficient plug-and-play scheme that can be
used to improve arbitrary GANs and, more interestingly, to effectively correct
failure cases of GAN models. Extensive experiments illustrate the advantages
of our proposed method over existing models.
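As a rough illustration of what latent-distribution morphing governed by a Fokker-Planck equation can look like, the sketch below (a hypothetical toy, not the authors' implementation) runs a few steps of discretized Langevin dynamics in a latent space; the density of the samples then evolves along the Fokker-Planck flow associated with the target score function:

```python
import numpy as np

def latent_langevin_morph(z, score_fn, step=0.1, n_steps=100, rng=None):
    """Evolve latent samples z by discretized Langevin dynamics.

    Each update z <- z + step * score(z) + sqrt(2 * step) * noise moves the
    sample density along the Fokker-Planck flow whose stationary
    distribution has score_fn as the gradient of its log-density.
    """
    rng = np.random.default_rng(rng)
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        z = z + step * score_fn(z) + np.sqrt(2.0 * step) * noise
    return z

# Toy check: morph a wide initial cloud toward a standard Gaussian,
# whose score is simply -z.
score = lambda z: -z
z0 = 3.0 * np.random.default_rng(0).standard_normal((5000, 2))
z1 = latent_langevin_morph(z0, score, step=0.1, n_steps=500, rng=1)
```

Running morphing for an arbitrary number of steps (or zero) mirrors the flexibility claim in the abstract; in the paper's setting the score would come from the learned latent model rather than a fixed Gaussian.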
Related papers
- Image Neural Field Diffusion Models [46.781775067944395]
We propose to learn the distribution of continuous images by training diffusion models on image neural fields.
We show that image neural field diffusion models can be trained on mixed-resolution image datasets, outperform fixed-resolution diffusion models, and can efficiently solve inverse problems with conditions applied at different scales.
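An image neural field is simply a network mapping continuous pixel coordinates to colors, which is what lets one field be queried at any resolution. A minimal untrained sketch (layer sizes and names are illustrative assumptions, not this paper's architecture):

```python
import numpy as np

def neural_field(coords, params):
    """Tiny coordinate MLP: (x, y) in [0, 1]^2 -> RGB in (0, 1)."""
    W1, b1, W2, b2 = params
    h = np.tanh(coords @ W1 + b1)                 # hidden features per coordinate
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid keeps colors valid

rng = np.random.default_rng(0)
params = (0.5 * rng.standard_normal((2, 32)), np.zeros(32),
          0.5 * rng.standard_normal((32, 3)), np.zeros(3))

# The same field renders at any resolution: here a 4x4 grid of queries.
g = np.linspace(0.0, 1.0, 4)
grid = np.stack(np.meshgrid(g, g, indexing="ij"), axis=-1).reshape(-1, 2)
rgb = neural_field(grid, params)                  # shape (16, 3)
```

A diffusion model over such fields (in practice over their parameters or a latent code) is what makes mixed-resolution training possible.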
arXiv Detail & Related papers (2024-06-11T17:24:02Z)
- Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
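For the pairwise-comparison labels mentioned above, a standard way to turn preferences into a scalar reward is a Bradley-Terry model fit by logistic regression on feature differences. This is a hedged sketch under that assumption, not necessarily the paper's estimator:

```python
import numpy as np

sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])               # unknown linear reward (synthetic)
X = rng.standard_normal((500, 2, 2))         # 500 pairs of 2-D designs
diff = (X[:, 0] - X[:, 1]) @ w_true
y = (rng.random(500) < sigmoid(diff)).astype(float)  # 1 if design 0 preferred

# Fit the Bradley-Terry model by gradient ascent on the log-likelihood:
# P(design 0 preferred) = sigmoid(r(x0) - r(x1)) with r(x) = w @ x.
w = np.zeros(2)
for _ in range(2000):
    d = (X[:, 0] - X[:, 1]) @ w
    grad = ((y - sigmoid(d))[:, None] * (X[:, 0] - X[:, 1])).mean(axis=0)
    w += 0.5 * grad
```

The learned reward can then be used to condition a diffusion sampler so that it proposes high-reward designs.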
arXiv Detail & Related papers (2024-03-20T00:41:12Z)
- Generative Neural Fields by Mixtures of Neural Implicit Functions [43.27461391283186]
We propose a novel approach to learning the generative neural fields represented by linear combinations of implicit basis networks.
Our algorithm learns basis networks in the form of implicit neural representations and their coefficients in a latent space by either conducting meta-learning or adopting auto-decoding paradigms.
arXiv Detail & Related papers (2023-10-30T11:41:41Z)
- A Metaheuristic for Amortized Search in High-Dimensional Parameter Spaces [0.0]
We propose a new metaheuristic that drives dimensionality reductions from feature-informed transformations.
DR-FFIT implements an efficient sampling strategy that facilitates a gradient-free parameter search in high-dimensional spaces.
Our test data show that DR-FFIT boosts the performance of random search and simulated annealing against well-established metaheuristics.
arXiv Detail & Related papers (2023-09-28T14:25:14Z)
- Deep Networks as Denoising Algorithms: Sample-Efficient Learning of Diffusion Models in High-Dimensional Graphical Models [22.353510613540564]
We investigate the approximation efficiency of score functions by deep neural networks in generative modeling.
We observe that score functions can often be well approximated in graphical models through variational-inference denoising algorithms.
We provide an efficient sample complexity bound for diffusion-based generative modeling when the score function is learned by deep neural networks.
arXiv Detail & Related papers (2023-09-20T15:51:10Z)
- Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization [73.80101701431103]
The linearized-Laplace approximation (LLA) has been shown to be effective and efficient in constructing Bayesian neural networks.
We study the usefulness of the LLA in Bayesian optimization and highlight its strong performance and flexibility.
arXiv Detail & Related papers (2023-04-17T14:23:43Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator means the latent space provides an unsatisfactory projection of the data space, resulting in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a wide range of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z)
- Deep Magnification-Flexible Upsampling over 3D Point Clouds [103.09504572409449]
We propose a novel end-to-end learning-based framework to generate dense point clouds.
We first formulate the problem explicitly, which boils down to determining the weights and high-order approximation errors.
Then, we design a lightweight neural network to adaptively learn unified and sorted weights as well as the high-order refinements.
arXiv Detail & Related papers (2020-11-25T14:00:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.