Inference and De-Noising of Non-Gaussian Particle Distribution
Functions: A Generative Modeling Approach
- URL: http://arxiv.org/abs/2110.02153v1
- Date: Tue, 5 Oct 2021 16:38:04 GMT
- Title: Inference and De-Noising of Non-Gaussian Particle Distribution
Functions: A Generative Modeling Approach
- Authors: John Donaghy, Kai Germaschewski
- Abstract summary: Inference on data produced by numerical simulations generally consists of binning the data to recover the particle distribution function.
Here we demonstrate the use of normalizing flows to learn a smooth, tractable approximation to the noisy particle distribution function.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The particle-in-cell numerical method of plasma physics balances a trade-off
between computational cost and intrinsic noise. Inference on data produced by
these simulations generally consists of binning the data to recover the
particle distribution function, from which physical processes may be
investigated. In addition to containing noise, the distribution function is
temporally dynamic and can be non-Gaussian and multi-modal, making the task of
modeling it difficult. Here we demonstrate the use of normalizing flows to
learn a smooth, tractable approximation to the noisy particle distribution
function. We demonstrate that the resulting data-driven likelihood conserves
relevant physics and may be extended to encapsulate the temporal evolution of
the distribution function.
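The core idea, fitting an invertible map by maximum likelihood so that the learned density is smooth and tractable, can be sketched with a minimal one-dimensional affine normalizing flow. This toy stands in for the deep flows used in the paper; the data, learning rate, and parameter names are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Noisy "particle" samples standing in for simulation output (illustrative).
x = rng.normal(loc=2.0, scale=0.5, size=5000)

# Affine flow z = (x - mu) / sigma maps the data to a standard-normal base
# distribution. The change of variables gives a tractable log-likelihood:
#   log p(x) = log N(z; 0, 1) + log |dz/dx| = log N(z; 0, 1) - log sigma.
mu, log_sigma = 0.0, 0.0
lr = 0.05
for _ in range(3000):
    sigma = np.exp(log_sigma)
    z = (x - mu) / sigma
    # Closed-form gradients of the mean log-likelihood w.r.t. mu, log sigma.
    mu += lr * np.mean(z) / sigma
    log_sigma += lr * (np.mean(z**2) - 1.0)

def log_prob(xq):
    """Smooth, tractable density learned by the flow."""
    z = (xq - mu) / np.exp(log_sigma)
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi) - log_sigma
```

Gradient ascent drives `mu` and `sigma` to the maximum-likelihood fit of the samples; real flows replace the affine map with a stack of learned invertible layers, which is what lets them capture the non-Gaussian, multi-modal shapes the abstract describes.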
Related papers
- Inferring biological processes with intrinsic noise from cross-sectional data [0.8192907805418583]
Inferring dynamical models from data continues to be a significant challenge in computational biology.
We show that probability flow inference (PFI) disentangles the deterministic force from intrinsic stochasticity while retaining the algorithmic ease of ODE inference.
In practical applications, we show that PFI enables accurate parameter and force estimation in high-dimensional reaction networks, and that it allows inference of cell differentiation dynamics with molecular noise.
arXiv Detail & Related papers (2024-10-10T00:33:25Z)
- Electrostatics-based particle sampling and approximate inference [0.0]
A new particle-based sampling and approximate inference method, based on principles of electrostatics and Newtonian mechanics, is introduced.
A discrete-time, discrete-space algorithmic design is provided for use in more general inference problems.
arXiv Detail & Related papers (2024-06-28T16:53:06Z)
- Learning to Approximate Particle Smoothing Trajectories via Diffusion Generative Models [16.196738720721417]
Learning systems from sparse observations is critical in numerous fields, including biology, finance, and physics.
We introduce a method that integrates conditional particle filtering with ancestral sampling and diffusion models.
We demonstrate the approach on time-series generation tasks, including vehicle tracking and single-cell RNA sequencing data.
arXiv Detail & Related papers (2024-06-01T21:54:01Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instilling task-specific information into the score function to steer sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
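The score-based generation that several of these entries build on can be illustrated in one dimension: if the score (gradient of the log-density) is known, Langevin dynamics draws samples from the distribution. Here the score of a two-mode Gaussian mixture is written in closed form as a stand-in for the neural score network a real diffusion model would learn; all values are illustrative and this is not the function-space method of the paper above:

```python
import numpy as np

rng = np.random.default_rng(2)

# Score (gradient of log density) of a 1D two-mode Gaussian mixture.
# In score-based generative models this function is learned by a network.
mu1, mu2, s = -2.0, 2.0, 0.5

def score(x):
    w1 = np.exp(-0.5 * ((x - mu1) / s) ** 2)
    w2 = np.exp(-0.5 * ((x - mu2) / s) ** 2)
    return (w1 * (mu1 - x) + w2 * (mu2 - x)) / (s**2 * (w1 + w2))

# Unadjusted Langevin dynamics: x <- x + eps*score(x) + sqrt(2*eps)*noise.
x = rng.normal(size=20000)
eps = 0.01
for _ in range(2000):
    x = x + eps * score(x) + np.sqrt(2 * eps) * rng.normal(size=x.size)
```

After the iterations, the particle ensemble is bimodal with roughly half the mass around each mode, showing how a known (or learned) score field alone suffices to generate samples from a multi-modal target.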
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
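The probabilistic representation behind such solvers can be sketched with the 1D heat equation, whose solution is a Feynman-Kac expectation over an ensemble of Brownian particles. This toy Monte Carlo estimate (diffusivity, time, and grid chosen for illustration; it is not the paper's neural solver) is checked against the analytic solution for a sinusoidal initial condition:

```python
import numpy as np

rng = np.random.default_rng(1)
nu, t = 0.5, 1.0                     # diffusivity and time (illustrative)
x = np.linspace(0, 2 * np.pi, 8)     # evaluation points

# Feynman-Kac: u(x, t) = E[u0(x + sqrt(2*nu*t) * Z)] with Z ~ N(0, 1),
# i.e. the heat-equation solution is an average over random particles.
Z = rng.normal(size=(200000, 1))
u_mc = np.mean(np.sin(x + np.sqrt(2 * nu * t) * Z), axis=0)

# Analytic solution for u0(x) = sin(x): u(x, t) = exp(-nu*t) * sin(x).
u_exact = np.exp(-nu * t) * np.sin(x)
```

The Monte Carlo average converges to the exact solution at O(1/sqrt(N)), which is the "macroscopic phenomena as ensembles of random particles" view the summary describes.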
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Modeling Temporal Data as Continuous Functions with Stochastic Process Diffusion [2.2849153854336763]
Temporal data can be viewed as discretized measurements of an underlying function.
To build a generative model for such data, we have to model the stochastic process that governs it.
We propose a solution by defining the denoising diffusion model in the function space.
arXiv Detail & Related papers (2022-11-04T17:02:01Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.