Geometric Priors for Scientific Generative Models in Inertial
Confinement Fusion
- URL: http://arxiv.org/abs/2111.12798v1
- Date: Wed, 24 Nov 2021 21:06:36 GMT
- Title: Geometric Priors for Scientific Generative Models in Inertial
Confinement Fusion
- Authors: Ankita Shukla, Rushil Anirudh, Eugene Kur, Jayaraman J. Thiagarajan,
Peer-Timo Bremer, Brian K. Spears, Tammy Ma, Pavan Turaga
- Abstract summary: We develop a Wasserstein autoencoder (WAE) with a hyperspherical prior for multimodal data.
We exploit a known relationship between the modalities in the dataset as a scientific constraint, and study different properties of the proposed model.
- Score: 32.1427322437781
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we develop a Wasserstein autoencoder (WAE) with a
hyperspherical prior for multimodal data in the application of inertial
confinement fusion. Unlike a typical hyperspherical generative model that
requires computationally inefficient sampling from distributions like the von
Mises-Fisher, we sample from a normal distribution followed by a projection layer
before the generator. Finally, to determine the validity of the generated
samples, we exploit a known relationship between the modalities in the dataset
as a scientific constraint, and study different properties of the proposed
model.
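To make the sampling step concrete, here is a minimal PyTorch sketch of drawing a latent from a standard normal and projecting it onto the unit hypersphere before the generator, as the abstract describes; the network sizes, layer choices, and names (HypersphericalDecoder, latent_dim, out_dim) are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class HypersphericalDecoder(nn.Module):
        # Toy generator: latents are projected onto the unit hypersphere
        # before decoding, so no explicit von Mises-Fisher sampling is needed.
        # Sizes and layers are illustrative, not taken from the paper.
        def __init__(self, latent_dim=16, out_dim=64):
            super().__init__()
            self.latent_dim = latent_dim
            self.net = nn.Sequential(
                nn.Linear(latent_dim, 128),
                nn.ReLU(),
                nn.Linear(128, out_dim),
            )

        @staticmethod
        def project_to_sphere(z, eps=1e-8):
            # Normalizing a standard-normal draw yields a point uniformly
            # distributed on the unit hypersphere.
            return z / (z.norm(dim=-1, keepdim=True) + eps)

        def sample(self, n):
            z = torch.randn(n, self.latent_dim)   # z ~ N(0, I)
            z_sphere = self.project_to_sphere(z)  # projection layer
            return self.net(z_sphere)             # generator / decoder

    decoder = HypersphericalDecoder()
    samples = decoder.sample(8)   # 8 generated samples
    print(samples.shape)          # torch.Size([8, 64])

Normalizing an isotropic Gaussian draw produces a point uniformly distributed on the hypersphere, which is why the projection layer avoids the cost of explicit von Mises-Fisher sampling when the prior is uniform on the sphere.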
Related papers
- A Likelihood Based Approach to Distribution Regression Using Conditional Deep Generative Models [6.647819824559201]
We study the large-sample properties of a likelihood-based approach for estimating conditional deep generative models.
Our results yield the convergence rate of a sieve maximum likelihood estimator for the conditional distribution.
arXiv Detail & Related papers (2024-10-02T20:46:21Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches in various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost (the posterior factorization underlying the tall-data setting is sketched below).
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
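For orientation, the tall-data setting above typically rests on the factorization of the posterior over conditionally independent observations; the identity below is the standard one and is given only as background, not as that paper's specific algorithm (here \theta denotes the parameters and x_1, …, x_n the observations).

    p(\theta \mid x_{1:n}) \;\propto\; p(\theta) \prod_{i=1}^{n} p(x_i \mid \theta)
    \nabla_\theta \log p(\theta \mid x_{1:n}) \;=\; (1 - n)\,\nabla_\theta \log p(\theta) \;+\; \sum_{i=1}^{n} \nabla_\theta \log p(\theta \mid x_i)

Score-based samplers typically compose per-observation posterior scores in this way, which is where the stability and cost considerations mentioned above arise.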
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that under these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Learning Joint Latent Space EBM Prior Model for Multi-layer Generator [44.4434704520236]
We study the fundamental problem of learning multi-layer generator models.
We propose an energy-based model (EBM) on the joint latent space over all layers of latent variables.
Our experiments demonstrate that the learned model can be expressive in generating high-quality images.
arXiv Detail & Related papers (2023-06-10T00:27:37Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data (a generic surrogate-plus-autodiff loop is sketched below).
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
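As a rough illustration of the surrogate-plus-automatic-differentiation idea described above, the sketch below fits unknown parameters by gradient descent through a frozen neural surrogate of the forward model; the shapes, optimizer settings, and the random stand-in for experimental data are assumptions made for the example, not details of that paper.

    import torch
    import torch.nn as nn

    # Illustrative surrogate: maps model parameters to a predicted measurement.
    # In practice it would first be trained on simulated (parameters, data) pairs.
    surrogate = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, 128))
    for p in surrogate.parameters():
        p.requires_grad_(False)                    # freeze the trained surrogate

    observed = torch.randn(128)                    # stand-in for experimental data
    params = torch.zeros(3, requires_grad=True)    # unknown physical parameters
    optimizer = torch.optim.Adam([params], lr=1e-2)

    for step in range(500):
        optimizer.zero_grad()
        predicted = surrogate(params)              # differentiable forward model
        loss = torch.mean((predicted - observed) ** 2)
        loss.backward()                            # gradients w.r.t. the parameters
        optimizer.step()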
- Generating High Fidelity Synthetic Data via Coreset selection and Entropic Regularization [15.866662428675054]
We propose using a combination of coreset selection methods and entropic regularization to select the highest-fidelity samples.
In a semi-supervised learning scenario, we show that augmenting the labeled dataset with our selected subset of samples leads to better accuracy.
arXiv Detail & Related papers (2023-01-31T22:59:41Z)
- An Energy-Based Prior for Generative Saliency [62.79775297611203]
We propose a novel generative saliency prediction framework that adopts an informative energy-based model as a prior distribution.
With the generative saliency model, we can obtain a pixel-wise uncertainty map from an image, indicating model confidence in the saliency prediction.
Experimental results show that our generative saliency model with an energy-based prior can achieve not only accurate saliency predictions but also reliable uncertainty maps consistent with human perception (a generic Monte Carlo construction of such a map is sketched below).
arXiv Detail & Related papers (2022-04-19T10:51:00Z)
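The pixel-wise uncertainty map mentioned above can be illustrated with a generic Monte Carlo construction: draw several stochastic saliency predictions for the same image and use their per-pixel variance as the uncertainty estimate. This is only a sketch of the general idea; the stochastic predictor, image size, and sample count below are placeholders, not that paper's energy-based sampler.

    import torch

    def uncertainty_map(sample_saliency, image, n_samples=20):
        # Draw several stochastic saliency predictions for one image; use the
        # per-pixel mean as the prediction and the variance as the uncertainty.
        preds = torch.stack([sample_saliency(image) for _ in range(n_samples)])
        return preds.mean(dim=0), preds.var(dim=0)

    # Toy usage with a random "predictor"; a trained stochastic model would be used instead.
    dummy_predictor = lambda img: torch.sigmoid(torch.randn_like(img))
    mean_saliency, uncertainty = uncertainty_map(dummy_predictor, torch.zeros(1, 256, 256))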
- Generative models and Bayesian inversion using Laplace approximation [0.3670422696827525]
Recently, inverse problems have been solved using generative models as highly informative priors.
We show that the derived Bayes estimates are consistent, in contrast to the approach that employs the low-dimensional manifold of the generative model.
arXiv Detail & Related papers (2022-03-15T10:05:43Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.