Learning High-Dimensional Distributions with Latent Neural Fokker-Planck
Kernels
- URL: http://arxiv.org/abs/2105.04538v1
- Date: Mon, 10 May 2021 17:42:01 GMT
- Title: Learning High-Dimensional Distributions with Latent Neural Fokker-Planck
Kernels
- Authors: Yufan Zhou, Changyou Chen, Jinhui Xu
- Abstract summary: We introduce new techniques that formulate the problem as solving a Fokker-Planck equation in a lower-dimensional latent space.
Our proposed model consists of latent-distribution morphing, a generator, and a parameterized Fokker-Planck kernel function.
- Score: 67.81799703916563
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning high-dimensional distributions is an important yet challenging
problem in machine learning, with applications in various domains. In this
paper, we introduce new techniques that formulate the problem as solving a
Fokker-Planck equation in a lower-dimensional latent space, aiming to mitigate
challenges in the high-dimensional data space. Our proposed model consists of
latent-distribution morphing, a generator, and a parameterized Fokker-Planck
kernel function. A notable property of our model is that it can be trained
with an arbitrary number of latent-distribution morphing steps, or even with
no morphing at all, which makes it as flexible and efficient as Generative
Adversarial Networks (GANs). This property also makes latent-distribution
morphing an efficient plug-and-play scheme: it can be used to improve
arbitrary GANs and, more interestingly, can effectively correct failure cases
of GAN models. Extensive experiments illustrate the advantages of our
proposed method over existing models.
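As a rough illustration only (not the paper's implementation, whose morphing uses a learned Fokker-Planck kernel), latent-distribution morphing can be pictured as running Langevin dynamics on latent samples: the stationary density of such dynamics solves the corresponding Fokker-Planck equation. The sketch below, with an assumed standard-Gaussian latent target whose score is `-z`, morphs mismatched latent samples toward that target; `n_steps=0` returns the samples unchanged, mirroring the claim that morphing steps are optional.

```python
import numpy as np

def grad_log_prior(z):
    """Score of an assumed standard Gaussian latent target: grad log p(z) = -z."""
    return -z

def morph_latents(z, n_steps=0, step_size=0.05, rng=None):
    """Langevin-style latent morphing; n_steps=0 leaves z untouched."""
    rng = np.random.default_rng(rng)
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        # Discretized Langevin update: drift along the score plus injected noise.
        z = z + step_size * grad_log_prior(z) + np.sqrt(2 * step_size) * noise
    return z

rng = np.random.default_rng(0)
z0 = rng.normal(loc=0.0, scale=3.0, size=10_000)   # mismatched latent samples
z_morphed = morph_latents(z0, n_steps=200, step_size=0.05, rng=1)
print(z0.std(), z_morphed.std())  # morphed std should approach the target's 1.0
```

In the plug-and-play reading, `z_morphed` would then be fed to a pretrained GAN generator, so poorly placed latent codes are corrected before decoding rather than retraining the generator.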
Related papers
- Diffusion Policies for Generative Modeling of Spacecraft Trajectories [1.2074552857379275]
A key shortcoming in current machine learning-based methods for trajectory generation is that they require large datasets.
In this work, we leverage compositional diffusion modeling to efficiently adapt out-of-distribution data.
We demonstrate the capability of compositional diffusion models for inference-time 6 DoF minimum-fuel landing site selection.
arXiv Detail & Related papers (2025-01-01T18:22:37Z)
- Learnable Infinite Taylor Gaussian for Dynamic View Rendering [55.382017409903305]
This paper introduces a novel approach based on a learnable Taylor Formula to model the temporal evolution of Gaussians.
The proposed method achieves state-of-the-art performance in this domain.
arXiv Detail & Related papers (2024-12-05T16:03:37Z)
- DiffSG: A Generative Solver for Network Optimization with Diffusion Model [75.27274046562806]
Diffusion generative models can consider a broader range of solutions and exhibit stronger generalization by learning parameters.
We propose a new framework, which leverages intrinsic distribution learning of diffusion generative models to learn high-quality solutions.
arXiv Detail & Related papers (2024-08-13T07:56:21Z)
- Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
arXiv Detail & Related papers (2024-03-20T00:41:12Z)
- Diffusion-Based Neural Network Weights Generation [80.89706112736353]
D2NWG is a diffusion-based neural network weights generation technique that efficiently produces high-performing weights for transfer learning.
Our method extends generative hyper-representation learning to recast the latent diffusion paradigm for neural network weights generation.
Our approach is scalable to large architectures such as large language models (LLMs), overcoming the limitations of current parameter generation techniques.
arXiv Detail & Related papers (2024-02-28T08:34:23Z)
- Generative Neural Fields by Mixtures of Neural Implicit Functions [43.27461391283186]
We propose a novel approach to learning the generative neural fields represented by linear combinations of implicit basis networks.
Our algorithm learns basis networks in the form of implicit neural representations and their coefficients in a latent space by either conducting meta-learning or adopting auto-decoding paradigms.
arXiv Detail & Related papers (2023-10-30T11:41:41Z)
- A Metaheuristic for Amortized Search in High-Dimensional Parameter Spaces [0.0]
We propose a new metaheuristic that drives dimensionality reductions from feature-informed transformations.
DR-FFIT implements an efficient sampling strategy that facilitates a gradient-free parameter search in high-dimensional spaces.
Our test data show that DR-FFIT boosts the performance of random search and simulated annealing relative to well-established metaheuristics.
arXiv Detail & Related papers (2023-09-28T14:25:14Z)
- A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.