Versatile Energy-Based Probabilistic Models for High Energy Physics
- URL: http://arxiv.org/abs/2302.00695v5
- Date: Thu, 18 Jan 2024 07:39:29 GMT
- Title: Versatile Energy-Based Probabilistic Models for High Energy Physics
- Authors: Taoli Cheng, Aaron Courville
- Abstract summary: We build a multi-purpose energy-based probabilistic model for High Energy Physics events at the Large Hadron Collider.
This framework builds on a powerful generative model and describes higher-order inter-particle interactions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As a classical generative modeling approach, energy-based models have the
natural advantage of flexibility in the form of the energy function. Recently,
energy-based models have achieved great success in modeling high-dimensional
data in computer vision and natural language processing. In line with these
advancements, we build a multi-purpose energy-based probabilistic model for
High Energy Physics events at the Large Hadron Collider. This framework builds
on a powerful generative model and describes higher-order inter-particle
interactions. It suits different encoding architectures and builds on implicit
generation. On the application side, it can serve as a powerful
parameterized event generator for physics simulation, a generic anomalous
signal detector free from spurious correlations, and an augmented event
classifier for particle identification.
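The "implicit generation" the abstract refers to is typically done by Langevin-dynamics sampling on the learned energy function. As a minimal sketch (not the paper's implementation), the following uses a toy quadratic energy in place of a learned network, so the gradient is analytic; a real EBM would obtain it via autodiff:

```python
import numpy as np

def grad_energy(x):
    # Gradient of a toy energy E(x) = 0.5 * ||x - 2||^2; in a real EBM this
    # would be the gradient of a neural-network energy, computed by autodiff.
    return x - 2.0

def langevin_sample(x0, rng, steps=500, step_size=0.01):
    # Langevin update: x <- x - eta * grad E(x) + sqrt(2 * eta) * N(0, I).
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - step_size * grad_energy(x) \
            + np.sqrt(2.0 * step_size) * rng.normal(size=x.shape)
    return x

rng = np.random.default_rng(0)
samples = np.array([langevin_sample([0.0], rng) for _ in range(200)]).ravel()
print(samples.mean())  # concentrates near the energy minimum at 2.0
```

The chains drift toward low-energy configurations while the injected noise keeps them distributed according to the model density, which is what lets one energy function serve simultaneously as a generator and a density-based anomaly score.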
Related papers
- Can KANs (re)discover predictive models for Direct-Drive Laser Fusion? [11.261403205522694]
The domain of laser fusion presents a unique and challenging predictive-modeling landscape for machine learning methods.
Data-driven approaches have previously succeeded at achieving the desired generalization ability and model interpretations that align with physics expectations.
In this work, we present the use of Kolmogorov-Arnold Networks (KANs) as an alternative to PIL for developing a new type of data-driven predictive model.
arXiv Detail & Related papers (2024-09-13T13:48:06Z)
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
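The maximum-likelihood-with-MCMC training mentioned above can be illustrated on a toy one-parameter energy. This is a hedged sketch under assumptions (not the paper's code): for E_theta(x) = 0.5 * (x - theta)^2, the ML gradient is mean(data batch) - mean(model samples), with model samples drawn by short-run Langevin chains:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=1.0, size=1000)  # synthetic training data
theta = 0.0

def sample_model(theta, n, steps=30, eta=0.1):
    # Short-run Langevin chains initialized from noise give the "negative"
    # (model) samples needed for the ML gradient estimate.
    x = rng.normal(size=n)
    for _ in range(steps):
        x += -eta * (x - theta) + np.sqrt(2.0 * eta) * rng.normal(size=n)
    return x

for _ in range(200):
    pos = rng.choice(data, size=64)           # data ("positive") batch
    neg = sample_model(theta, 64)             # MCMC negatives
    theta += 0.5 * (pos.mean() - neg.mean())  # stochastic ML gradient step

print(theta)  # recovers the data mean, near 3.0
```

The update pushes energy down on data and up on model samples; training converges when the two expectations match, i.e. when the model distribution matches the data.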
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Advancing Building Energy Modeling with Large Language Models: Exploration and Case Studies [2.8879609855863713]
The rapid progression in artificial intelligence has facilitated the emergence of large language models like ChatGPT.
This paper investigates the innovative integration of large language models with building energy modeling software.
arXiv Detail & Related papers (2024-02-14T21:02:07Z)
- Evaluating the diversity and utility of materials proposed by generative models [38.85523285991743]
We show how one state-of-the-art generative model, the physics-guided crystal generation model, can be used as part of the inverse design process.
Our findings suggest how generative models might be improved to enable better inverse design.
arXiv Detail & Related papers (2023-08-09T14:42:08Z)
- A Full Quantum Generative Adversarial Network Model for High Energy Physics Simulations [0.0]
We develop a quantum Generative Adversarial Network (GAN) model for generating downsized eight-pixel calorimeter shower images.
The advantage over previous quantum models is that the model generates real individual images containing pixel energy values.
Results of the full quantum GAN model are compared to hybrid quantum-classical models using a classical discriminator neural network.
arXiv Detail & Related papers (2023-05-12T06:57:31Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- Energy Transformer [64.22957136952725]
Our work combines aspects of three promising paradigms in machine learning, namely, attention mechanism, energy-based models, and associative memory.
We propose a novel architecture, called the Energy Transformer (or ET for short), that uses a sequence of attention layers that are purposely designed to minimize a specifically engineered energy function.
arXiv Detail & Related papers (2023-02-14T18:51:22Z)
- Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One [83.5162421521224]
We propose a unique method termed E-ARM for training autoregressive generative models.
E-ARM takes advantage of a well-designed energy-based learning objective.
We show that E-ARM can be trained efficiently and is capable of alleviating the exposure bias problem.
arXiv Detail & Related papers (2022-06-26T10:58:41Z)
- Controllable and Compositional Generation with Latent-Space Energy-Based Models [60.87740144816278]
Controllable generation is one of the key requirements for successful adoption of deep generative models in real-world applications.
In this work, we use energy-based models (EBMs) to handle compositional generation over a set of attributes.
By composing energy functions with logical operators, this work is the first to achieve such compositionality in generating photo-realistic images of resolution 1024x1024.
arXiv Detail & Related papers (2021-10-21T03:31:45Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.