Generative Diffusion Models for Fast Simulations of Particle Collisions at CERN
- URL: http://arxiv.org/abs/2406.03233v1
- Date: Wed, 5 Jun 2024 13:11:53 GMT
- Title: Generative Diffusion Models for Fast Simulations of Particle Collisions at CERN
- Authors: Mikołaj Kita, Jan Dubiński, Przemysław Rokita, Kamil Deja
- Abstract summary: In High Energy Physics, simulations play a crucial role in unraveling the complexities of particle collision experiments within CERN's Large Hadron Collider.
Recent advancements highlight the efficacy of diffusion models as state-of-the-art generative machine learning methods.
We present the first simulation for Zero Degree Calorimeter (ZDC) at the ALICE experiment based on diffusion models, achieving the highest fidelity compared to existing baselines.
- Score: 3.2686289567336235
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In High Energy Physics, simulations play a crucial role in unraveling the complexities of particle collision experiments within CERN's Large Hadron Collider. Machine learning simulation methods have garnered attention as promising alternatives to traditional approaches. While existing methods mainly employ Variational Autoencoders (VAEs) or Generative Adversarial Networks (GANs), recent advancements highlight the efficacy of diffusion models as state-of-the-art generative machine learning methods. We present the first simulation for the Zero Degree Calorimeter (ZDC) at the ALICE experiment based on diffusion models, achieving the highest fidelity compared to existing baselines. We analyze the trade-offs between generation time and simulation quality. The results indicate significant potential for the latent diffusion model due to its rapid generation time.
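The abstract describes generating calorimeter responses by iteratively denoising Gaussian noise. A minimal sketch of the standard DDPM ancestral sampling loop is shown below; the schedule values, the 44x44 output shape, and the stand-in noise-prediction function are illustrative assumptions, not details from the paper (a trained network would replace `dummy_eps`).

```python
import numpy as np

def make_ddpm_schedule(T=50, beta_start=1e-4, beta_end=0.02):
    """Linear beta schedule and cumulative alpha products (standard DDPM)."""
    betas = np.linspace(beta_start, beta_end, T)
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    return betas, alphas, alpha_bars

def ddpm_sample(eps_model, shape, T=50, seed=0):
    """Ancestral DDPM sampling: start from pure noise and denoise step by step."""
    rng = np.random.default_rng(seed)
    betas, alphas, alpha_bars = make_ddpm_schedule(T)
    x = rng.standard_normal(shape)
    for t in reversed(range(T)):
        eps = eps_model(x, t)  # predicted noise at step t
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps) / np.sqrt(alphas[t])
        noise = rng.standard_normal(shape) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    return x

# Stand-in for a trained noise-prediction network (here it predicts zero noise).
dummy_eps = lambda x, t: np.zeros_like(x)
# Hypothetical response shape; the real ZDC response resolution may differ.
sample = ddpm_sample(dummy_eps, shape=(44, 44))
```

The trade-off the abstract mentions follows directly from this loop: each sample costs `T` network evaluations, which is why reducing steps (or sampling in a compressed latent space, as latent diffusion does) speeds up generation.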
Related papers
- Energy-Based Diffusion Language Models for Text Generation [126.23425882687195]
Energy-based Diffusion Language Model (EDLM) is an energy-based model operating at the full sequence level for each diffusion step.
Our framework offers a 1.3× sampling speedup over existing diffusion models.
arXiv Detail & Related papers (2024-10-28T17:25:56Z) - Unfolding Time: Generative Modeling for Turbulent Flows in 4D [49.843505326598596]
This work introduces a 4D generative diffusion model and a physics-informed guidance technique that enables the generation of realistic sequences of flow states.
Our findings indicate that the proposed method can successfully sample entire subsequences from the turbulent manifold.
This advancement opens doors for the application of generative modeling in analyzing the temporal evolution of turbulent flows.
arXiv Detail & Related papers (2024-06-17T10:21:01Z) - Deep Generative Models for Proton Zero Degree Calorimeter Simulations in ALICE, CERN [3.2686289567336235]
We present an innovative deep learning simulation approach tailored for the proton Zero Degree Calorimeter in the ALICE experiment.
Our method offers a significant speedup compared with traditional Monte Carlo based approaches.
arXiv Detail & Related papers (2024-06-05T13:41:09Z) - Physics-enhanced Neural Operator for Simulating Turbulent Transport [9.923888452768919]
This paper presents a physics-enhanced neural operator (PENO) that incorporates physical knowledge of partial differential equations (PDEs) to accurately model flow dynamics.
The proposed method is evaluated through its performance on two distinct sets of 3D turbulent flow data.
arXiv Detail & Related papers (2024-05-31T20:05:17Z) - Particle physics DL-simulation with control over generated data properties [3.2686289567336235]
The development of collision simulations at CERN has sparked research into innovative methods aimed at reducing costs and shortening the time needed for simulation.
Deep learning generative methods, including VAEs, GANs, and diffusion models, have been used for this purpose.
This work provides an alternative to currently employed algorithms by introducing a mechanism of control over the properties of the generated data.
arXiv Detail & Related papers (2024-05-22T22:39:29Z) - Uncertainty-aware Surrogate Models for Airfoil Flow Simulations with Denoising Diffusion Probabilistic Models [26.178192913986344]
We make a first attempt to use denoising diffusion probabilistic models (DDPMs) to train an uncertainty-aware surrogate model for turbulence simulations.
Our results show DDPMs can successfully capture the whole distribution of solutions and, as a consequence, accurately estimate the uncertainty of the simulations.
We also evaluate an emerging generative modeling variant, flow matching, in comparison to regular diffusion models.
arXiv Detail & Related papers (2023-12-08T19:04:17Z) - Learning Energy-Based Prior Model with Diffusion-Amortized MCMC [89.95629196907082]
The common practice of learning latent space EBMs with non-convergent short-run MCMC for prior and posterior sampling hinders the model from further progress.
We introduce a simple but effective diffusion-based amortization method for long-run MCMC sampling and develop a novel learning algorithm for the latent space EBM based on it.
arXiv Detail & Related papers (2023-10-05T00:23:34Z) - Machine Learning methods for simulating particle response in the Zero Degree Calorimeter at the ALICE experiment, CERN [8.980453507536017]
Currently, over half of the computing power at CERN GRID is used to run High Energy Physics simulations.
The recent updates at the Large Hadron Collider (LHC) create the need for developing more efficient simulation methods.
We propose an alternative approach to the problem that leverages machine learning.
arXiv Detail & Related papers (2023-06-23T16:45:46Z) - A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z) - Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One [83.5162421521224]
We propose a unique method termed E-ARM for training autoregressive generative models.
E-ARM takes advantage of a well-designed energy-based learning objective.
We show that E-ARM can be trained efficiently and is capable of alleviating the exposure bias problem.
arXiv Detail & Related papers (2022-06-26T10:58:41Z) - Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep active learning framework for stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP based models.
The results demonstrate STNP outperforms the baselines in the learning setting and LIG achieves the state-of-the-art for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.