PC-JeDi: Diffusion for Particle Cloud Generation in High Energy Physics
- URL: http://arxiv.org/abs/2303.05376v2
- Date: Wed, 21 Feb 2024 13:38:39 GMT
- Title: PC-JeDi: Diffusion for Particle Cloud Generation in High Energy Physics
- Authors: Matthew Leigh, Debajyoti Sengupta, Guillaume Quétant, John Andrew Raine, Knut Zoch, and Tobias Golling
- Abstract summary: We present a new method to efficiently generate jets in High Energy Physics called PC-JeDi.
This method uses score-based diffusion models in conjunction with transformers which are well suited to the task of generating jets as particle clouds.
PC-JeDi achieves competitive performance with current state-of-the-art methods across several metrics that evaluate the quality of the generated jets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we present a new method to efficiently generate jets in High
Energy Physics called PC-JeDi. This method utilises score-based diffusion
models in conjunction with transformers which are well suited to the task of
generating jets as particle clouds due to their permutation equivariance.
PC-JeDi achieves competitive performance with current state-of-the-art methods
across several metrics that evaluate the quality of the generated jets.
Although slower than other models, due to the large number of forward passes
required by diffusion models, it is still substantially faster than traditional
detailed simulation. Furthermore, PC-JeDi uses conditional generation to
produce jets with a desired mass and transverse momentum for two different
particles, top quarks and gluons.
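As a rough illustration of the approach described in the abstract, the sketch below shows a permutation-equivariant transformer that predicts per-particle noise while being conditioned on the diffusion time and on jet-level quantities such as mass, transverse momentum, and particle type. All module names and dimensions are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a conditional, permutation-equivariant denoiser for
# particle clouds, in the spirit of PC-JeDi (illustrative, not the authors' code).
import torch
import torch.nn as nn

class ParticleCloudDenoiser(nn.Module):
    def __init__(self, n_feats: int = 3, d_model: int = 128, n_layers: int = 4,
                 n_cond: int = 3):  # e.g. (mass, pT, jet type) as conditioning
        super().__init__()
        self.embed = nn.Linear(n_feats, d_model)            # per-particle embedding
        self.cond_embed = nn.Linear(n_cond + 1, d_model)    # jet condition + diffusion time
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.out = nn.Linear(d_model, n_feats)              # predict per-particle noise

    def forward(self, x, t, cond, mask=None):
        # x:    (batch, n_particles, n_feats)  noisy particle cloud
        # t:    (batch, 1)                     diffusion time / noise level
        # cond: (batch, n_cond)                jet-level conditions (mass, pT, type)
        c = self.cond_embed(torch.cat([cond, t], dim=-1)).unsqueeze(1)
        h = self.embed(x) + c                                # broadcast condition to every particle
        # No positional encoding is added, so the network is permutation-equivariant
        # over the particles in the cloud.
        h = self.encoder(h, src_key_padding_mask=mask)
        return self.out(h)

# Toy usage: 32 jets with up to 30 constituents, 3 kinematic features each.
model = ParticleCloudDenoiser()
eps_pred = model(torch.randn(32, 30, 3), torch.rand(32, 1), torch.randn(32, 3))
```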
Related papers
- Energy-Based Diffusion Language Models for Text Generation [126.23425882687195]
Energy-based Diffusion Language Model (EDLM) is an energy-based model operating at the full sequence level for each diffusion step.
Our framework offers a 1.3× sampling speedup over existing diffusion models.
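The general idea of letting a sequence-level energy model correct a diffusion proposal can be illustrated with self-normalised importance weighting; the toy snippet below shows that generic pattern only, with a stand-in energy function, and is not the EDLM algorithm or its training objective.

```python
# Toy illustration: reweight candidate sequences from a diffusion proposal
# with a sequence-level energy, via self-normalised importance sampling.
# Generic pattern only; not the EDLM method itself.
import torch

def energy(seqs: torch.Tensor) -> torch.Tensor:
    # Stand-in energy: penalise repeated adjacent tokens (purely illustrative).
    return (seqs[:, 1:] == seqs[:, :-1]).float().sum(dim=1)

def resample_with_energy(candidates: torch.Tensor) -> torch.Tensor:
    # candidates: (n_candidates, seq_len) token ids drawn from the diffusion proposal
    log_w = -energy(candidates)              # lower energy -> higher weight
    probs = torch.softmax(log_w, dim=0)      # self-normalised weights
    idx = torch.multinomial(probs, num_samples=1)
    return candidates[idx.item()]

candidates = torch.randint(0, 50, (16, 12))  # 16 proposal sequences of length 12
chosen = resample_with_energy(candidates)
```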
arXiv Detail & Related papers (2024-10-28T17:25:56Z)
- One-Step Diffusion Distillation through Score Implicit Matching [74.91234358410281]
We present Score Implicit Matching (SIM), a new approach to distilling pre-trained diffusion models into single-step generator models.
SIM shows strong empirical performances for one-step generators.
By applying SIM to a leading transformer-based diffusion model, we distill a single-step generator for text-to-image generation.
arXiv Detail & Related papers (2024-10-22T08:17:20Z)
- Dynamic Diffusion Transformer [67.13876021157887]
Diffusion Transformer (DiT) has demonstrated superior performance but suffers from substantial computational costs.
We propose Dynamic Diffusion Transformer (DyDiT), an architecture that dynamically adjusts its computation along both timestep and spatial dimensions during generation.
With 3% additional fine-tuning, our method reduces the FLOPs of DiT-XL by 51%, accelerates generation by 1.73×, and achieves a competitive FID score of 2.07 on ImageNet.
arXiv Detail & Related papers (2024-10-04T14:14:28Z)
- Flow Matching Beyond Kinematics: Generating Jets with Particle-ID and Trajectory Displacement Information [0.0]
We introduce the first generative model trained on the JetClass dataset.
Our model generates jets at the constituent level, and it is a permutation-equivariant continuous normalizing flow (CNF) trained with the flow matching technique.
For the first time, we also introduce a generative model that goes beyond the kinematic features of jet constituents.
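The flow matching technique referred to here has a simple generic training objective: regress a velocity field on straight-line interpolations between data and noise. The sketch below shows that loss for a particle cloud with a stand-in per-particle network; it follows the general conditional flow matching recipe rather than this paper's exact setup.

```python
# Generic conditional flow matching loss for a particle cloud
# (linear interpolation path; illustrative, not this paper's training code).
import torch

def flow_matching_loss(velocity_net, x0: torch.Tensor) -> torch.Tensor:
    # x0: (batch, n_particles, n_feats) real jet constituents
    x1 = torch.randn_like(x0)                 # noise sample
    t = torch.rand(x0.size(0), 1, 1)          # one time per jet
    xt = (1.0 - t) * x0 + t * x1              # straight-line interpolant
    target_v = x1 - x0                        # its constant velocity
    pred_v = velocity_net(xt, t)              # network predicts the velocity field
    return ((pred_v - target_v) ** 2).mean()

# Stand-in permutation-equivariant network: a shared per-particle MLP.
net = torch.nn.Sequential(torch.nn.Linear(4, 64), torch.nn.ReLU(), torch.nn.Linear(64, 3))
velocity_net = lambda x, t: net(torch.cat([x, t.expand(-1, x.size(1), 1)], dim=-1))

loss = flow_matching_loss(velocity_net, torch.randn(8, 30, 3))
```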
arXiv Detail & Related papers (2023-11-30T19:00:02Z)
- EPiC-ly Fast Particle Cloud Generation with Flow-Matching and Diffusion [0.7255608805275865]
We present two novel methods that generate LHC jets as point clouds efficiently and accurately.
EPiC-JeDi and EPiC-FM both achieve state-of-the-art performance on the top-quark JetNet datasets.
arXiv Detail & Related papers (2023-09-29T18:00:03Z)
- PC-Droid: Faster diffusion and improved quality for particle cloud generation [0.9374652839580183]
Building on the success of PC-JeDi, we introduce PC-Droid, a substantially improved diffusion model for the generation of jet particle clouds.
By leveraging a new diffusion formulation, studying more recent integration solvers, and training on all jet types simultaneously, we are able to achieve state-of-the-art performance for all types of jets.
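One example of the kind of integration solver such studies compare is Heun's second-order method for the probability-flow ODE; the sketch below shows that integrator in a generic form, where the denoiser, noise schedule, and step count are placeholders rather than PC-Droid's actual configuration.

```python
# Heun (second-order) sampler for a generic probability-flow ODE
# dx/dsigma = (x - D(x, sigma)) / sigma; placeholders throughout.
import torch

def heun_sample(denoiser, x: torch.Tensor, sigmas: torch.Tensor) -> torch.Tensor:
    # denoiser(x, sigma) should return an estimate of the clean sample.
    # sigmas: decreasing noise levels, ending at (or near) zero.
    for i in range(len(sigmas) - 1):
        sig, sig_next = sigmas[i], sigmas[i + 1]
        d = (x - denoiser(x, sig)) / sig                   # ODE derivative at sigma
        x_euler = x + (sig_next - sig) * d                 # Euler predictor
        if sig_next > 0:
            d_next = (x_euler - denoiser(x_euler, sig_next)) / sig_next
            x = x + (sig_next - sig) * 0.5 * (d + d_next)  # Heun corrector
        else:
            x = x_euler
    return x

# Toy usage with a trivial "denoiser" and 20 log-spaced noise levels.
sigmas = torch.cat([torch.logspace(1, -2, 20), torch.zeros(1)])
samples = heun_sample(lambda x, s: x * 0.0, torch.randn(8, 30, 3), sigmas)
```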
arXiv Detail & Related papers (2023-07-13T15:56:23Z)
- EPiC-GAN: Equivariant Point Cloud Generation for Particle Jets [0.0]
We introduce EPiC-GAN - equivariant point cloud generative adversarial network - which can produce point clouds of variable multiplicity.
EPiC-GAN scales well to large particle multiplicities and achieves high generation fidelity on benchmark jet generation tasks.
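A rough sketch of the kind of equivariant point-cloud layer such a generator can be built from is shown below: a global summary vector is pooled from all points and fed back into a shared per-point update, which keeps the layer permutation-equivariant and lets the cost scale linearly with multiplicity. This is a generic illustration in the spirit of the EPiC layer, not its exact definition.

```python
# Generic equivariant point-cloud layer: pooled global vector + shared per-point update.
# Illustrative only; in the spirit of (not identical to) the EPiC layer.
import torch
import torch.nn as nn

class GlobalLocalLayer(nn.Module):
    def __init__(self, d_local: int = 64, d_global: int = 16):
        super().__init__()
        self.to_global = nn.Sequential(nn.Linear(d_local + d_global, d_global), nn.LeakyReLU())
        self.to_local = nn.Sequential(nn.Linear(d_local + d_global, d_local), nn.LeakyReLU())

    def forward(self, points: torch.Tensor, glob: torch.Tensor):
        # points: (batch, n_points, d_local), glob: (batch, d_global)
        pooled = points.mean(dim=1)                               # permutation-invariant pooling
        glob = self.to_global(torch.cat([pooled, glob], dim=-1))  # update global summary
        g = glob.unsqueeze(1).expand(-1, points.size(1), -1)      # broadcast back to points
        points = self.to_local(torch.cat([points, g], dim=-1))    # shared per-point update
        return points, glob

layer = GlobalLocalLayer()
pts, g = layer(torch.randn(4, 50, 64), torch.zeros(4, 16))
```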
arXiv Detail & Related papers (2023-01-17T19:00:00Z)
- Transformer with Implicit Edges for Particle-based Physics Simulation [135.77656965678196]
Transformer with Implicit Edges (TIE) captures the rich semantics of particle interactions in an edge-free manner.
We evaluate our model on diverse domains of varying complexity and materials.
arXiv Detail & Related papers (2022-07-22T03:45:29Z)
- Controllable and Compositional Generation with Latent-Space Energy-Based Models [60.87740144816278]
Controllable generation is one of the key requirements for successful adoption of deep generative models in real-world applications.
In this work, we use energy-based models (EBMs) to handle compositional generation over a set of attributes.
By composing energy functions with logical operators, this work is the first to achieve such compositionality in generating photo-realistic images of resolution 1024x1024.
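The logical composition mentioned here has a standard energy-space reading: conjunction adds energies, disjunction takes a soft minimum, and negation subtracts a scaled energy. The toy snippet below shows those compositions on 2D energies and samples the result with Langevin dynamics; it illustrates the general recipe, not this paper's latent-space models.

```python
# Composing energy functions with logical operators (AND/OR/NOT) and sampling
# the result with Langevin dynamics. Toy 2D energies; generic recipe only.
import torch

e_a = lambda x: ((x - torch.tensor([1.0, 0.0])) ** 2).sum(-1)   # attribute A: near (1, 0)
e_b = lambda x: ((x - torch.tensor([0.0, 1.0])) ** 2).sum(-1)   # attribute B: near (0, 1)

e_and = lambda x: e_a(x) + e_b(x)                                            # A AND B: sum of energies
e_or  = lambda x: -torch.logsumexp(torch.stack([-e_a(x), -e_b(x)]), dim=0)   # A OR B: soft minimum
e_not = lambda x: e_a(x) - 0.5 * e_b(x)                                      # A AND NOT B: penalise B

def langevin(energy, n_steps: int = 200, step: float = 0.01):
    x = torch.randn(256, 2)
    for _ in range(n_steps):
        x = x.detach().requires_grad_(True)
        grad = torch.autograd.grad(energy(x).sum(), x)[0]
        x = x - step * grad + (2 * step) ** 0.5 * torch.randn_like(x)
    return x.detach()

samples = langevin(e_and)   # samples concentrate between the two attribute centres
```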
arXiv Detail & Related papers (2021-10-21T03:31:45Z)
- Particle Cloud Generation with Message Passing Generative Adversarial Networks [14.737885252814273]
In high energy physics, jets are collections of correlated particles produced ubiquitously in particle collisions.
Machine-learning-based generative models, such as generative adversarial networks (GANs), have the potential to significantly accelerate LHC jet simulations.
We introduce a new particle cloud dataset (JetNet) and, given the similarities between particle and point clouds, apply existing point cloud GANs to it.
arXiv Detail & Related papers (2021-06-22T04:21:16Z)
- Efficient pre-training objectives for Transformers [84.64393460397471]
We study several efficient pre-training objectives for Transformers-based models.
We prove that eliminating the MASK token and computing the loss over the whole output are essential choices for improving performance.
arXiv Detail & Related papers (2021-04-20T00:09:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the accuracy of the information above and is not responsible for any consequences of its use.