EPiC-ly Fast Particle Cloud Generation with Flow-Matching and Diffusion
- URL: http://arxiv.org/abs/2310.00049v1
- Date: Fri, 29 Sep 2023 18:00:03 GMT
- Title: EPiC-ly Fast Particle Cloud Generation with Flow-Matching and Diffusion
- Authors: Erik Buhmann, Cedric Ewen, Darius A. Faroughy, Tobias Golling, Gregor Kasieczka, Matthew Leigh, Guillaume Quétant, John Andrew Raine, Debajyoti Sengupta, David Shih
- Abstract summary: We present two novel methods that generate LHC jets as point clouds efficiently and accurately.
EPiC-JeDi and EPiC-FM both achieve state-of-the-art performance on the top-quark JetNet datasets.
- Score: 0.7255608805275865
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Jets at the LHC, typically consisting of a large number of highly correlated
particles, are a fascinating laboratory for deep generative modeling. In this
paper, we present two novel methods that generate LHC jets as point clouds
efficiently and accurately. We introduce EPiC-JeDi, which combines
score-matching diffusion models with the Equivariant Point Cloud (EPiC)
architecture based on the deep sets framework. This model offers a much faster
alternative to previous transformer-based diffusion models without reducing the
quality of the generated jets. In addition, we introduce EPiC-FM, the first
permutation-equivariant continuous normalizing flow (CNF) for particle cloud
generation. This model is trained with flow-matching, a scalable and
easy-to-train objective based on optimal transport that directly regresses the
vector fields connecting the Gaussian noise prior to the data distribution. Our
experiments demonstrate that EPiC-JeDi and EPiC-FM both achieve state-of-the-art
performance on the top-quark JetNet datasets whilst maintaining fast generation
speed. Most notably, we find that the EPiC-FM model consistently outperforms all
the other generative models considered here across every metric. Finally, we
also introduce two new particle cloud performance metrics: the first based on
the Kullback-Leibler divergence between feature distributions, the second on
the negative log-posterior of a multi-model ParticleNet classifier.
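The flow-matching objective described in the abstract can be sketched in a few lines: under the commonly used straight-line (optimal-transport) probability path, the regression target for the vector field is simply the difference between a data sample and its paired noise sample. The sketch below is a minimal NumPy illustration of that objective, not the paper's actual implementation; the function names and the toy `model` callable are our own, and a real setup would use a permutation-equivariant network over particle clouds rather than a generic callable.

```python
import numpy as np

def flow_matching_targets(x0, x1, t):
    """Straight-line (optimal-transport) path between noise x0 and data x1,
    plus the target velocity field the network should regress.
    t has shape (batch,); x0 and x1 have shape (batch, features)."""
    t = t.reshape(-1, 1)
    x_t = (1.0 - t) * x0 + t * x1  # point on the linear path at time t
    v_target = x1 - x0             # constant velocity along that path
    return x_t, v_target

def flow_matching_loss(model, x1, rng):
    """One stochastic estimate of the flow-matching objective:
    sample Gaussian noise and a time, then regress model(x_t, t)
    onto the target velocity."""
    x0 = rng.standard_normal(x1.shape)  # Gaussian noise prior
    t = rng.uniform(size=x1.shape[0])   # uniform time in [0, 1]
    x_t, v_target = flow_matching_targets(x0, x1, t)
    return float(np.mean((model(x_t, t) - v_target) ** 2))
```

At t = 0 the path point coincides with the noise sample and at t = 1 with the data sample, so a vector field trained this way transports the Gaussian prior onto the data distribution when integrated from 0 to 1.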
Related papers
- One-Step Diffusion Distillation through Score Implicit Matching [74.91234358410281]
We present Score Implicit Matching (SIM) a new approach to distilling pre-trained diffusion models into single-step generator models.
SIM shows strong empirical performances for one-step generators.
By applying SIM to a leading transformer-based diffusion model, we distill a single-step generator for text-to-image generation.
arXiv Detail & Related papers (2024-10-22T08:17:20Z)
- Derivative-Free Guidance in Continuous and Discrete Diffusion Models with Soft Value-Based Decoding [84.3224556294803]
Diffusion models excel at capturing the natural design spaces of images, molecules, DNA, RNA, and protein sequences.
We aim to optimize downstream reward functions while preserving the naturalness of these design spaces.
Our algorithm integrates soft value functions, which look ahead to how intermediate noisy states lead to high rewards in the future.
arXiv Detail & Related papers (2024-08-15T16:47:59Z)
- Learning to Approximate Particle Smoothing Trajectories via Diffusion Generative Models [16.196738720721417]
Learning systems from sparse observations is critical in numerous fields, including biology, finance, and physics.
We introduce a method that integrates conditional particle filtering with ancestral sampling and diffusion models.
We demonstrate the approach on time-series generation tasks, including vehicle tracking and single-cell RNA sequencing data.
arXiv Detail & Related papers (2024-06-01T21:54:01Z)
- Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, with a speedup in computation compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z)
- Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z)
- Improving Generative Model-based Unfolding with Schrödinger Bridges [14.989614554242229]
Machine learning-based unfolding has enabled unbinned and high-dimensional differential cross section measurements.
We propose to use Schrödinger bridges and diffusion models to create SBUnfold, an unfolding approach that combines the strengths of both discriminative and generative models.
We show that SBUnfold achieves excellent performance compared to state-of-the-art methods on a synthetic Z+jets dataset.
arXiv Detail & Related papers (2023-08-23T18:01:01Z)
- EPiC-GAN: Equivariant Point Cloud Generation for Particle Jets [0.0]
We introduce EPiC-GAN - equivariant point cloud generative adversarial network - which can produce point clouds of variable multiplicity.
EPiC-GAN scales well to large particle multiplicities and achieves high generation fidelity on benchmark jet generation tasks.
arXiv Detail & Related papers (2023-01-17T19:00:00Z)
- Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)
- Particle Cloud Generation with Message Passing Generative Adversarial Networks [14.737885252814273]
In high energy physics, jets are collections of correlated particles produced ubiquitously in particle collisions.
Machine-learning-based generative models, such as generative adversarial networks (GANs), have the potential to significantly accelerate LHC jet simulations.
We introduce a new particle cloud dataset (JetNet), and, due to similarities between particle and point clouds, apply to it existing point cloud GANs.
arXiv Detail & Related papers (2021-06-22T04:21:16Z)
- Go with the Flows: Mixtures of Normalizing Flows for Point Cloud Generation and Reconstruction [98.38585659305325]
Normalizing flows (NFs) have demonstrated state-of-the-art performance on modeling 3D point clouds.
This work enhances their representational power by applying mixtures of NFs to point clouds.
arXiv Detail & Related papers (2021-06-06T14:25:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.