Flow Matching Beyond Kinematics: Generating Jets with Particle-ID and Trajectory Displacement Information
- URL: http://arxiv.org/abs/2312.00123v1
- Date: Thu, 30 Nov 2023 19:00:02 GMT
- Title: Flow Matching Beyond Kinematics: Generating Jets with Particle-ID and Trajectory Displacement Information
- Authors: Joschka Birk, Erik Buhmann, Cedric Ewen, Gregor Kasieczka, David Shih
- Abstract summary: We introduce the first generative model trained on the JetClass dataset.
Our model generates jets at the constituent level, and it is a permutation-equivariant continuous normalizing flow (CNF) trained with the flow matching technique.
For the first time, we also introduce a generative model that goes beyond the kinematic features of jet constituents.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce the first generative model trained on the JetClass dataset. Our
model generates jets at the constituent level, and it is a
permutation-equivariant continuous normalizing flow (CNF) trained with the flow
matching technique. It is conditioned on the jet type, so that a single model
can be used to generate the ten different jet types of JetClass. For the first
time, we also introduce a generative model that goes beyond the kinematic
features of jet constituents. The JetClass dataset includes more features, such
as particle-ID and track impact parameter, and we demonstrate that our CNF can
accurately model all of these additional features as well. Our generative model
for JetClass expands on the versatility of existing jet generation techniques,
enhancing their potential utility in high-energy physics research, and offering
a more comprehensive understanding of the generated jets.
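The abstract's core technique, a continuous normalizing flow trained with flow matching, can be illustrated with a minimal sketch of the conditional flow-matching objective: sample noise, interpolate linearly toward a data point, and regress the model's velocity onto the interpolation's velocity. The function names and the toy stand-in model below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def flow_matching_loss(model, x1, cond, rng):
    """Conditional flow-matching loss for one batch (illustrative sketch).

    x1   : (N, D) array, one "particle cloud" of N constituents with D features
    cond : conditioning information (e.g. a jet-type label), passed to the model
    """
    n, d = x1.shape
    x0 = rng.standard_normal((n, d))   # noise endpoint of the probability path
    t = rng.uniform(size=(n, 1))       # random time in [0, 1) per constituent
    xt = (1.0 - t) * x0 + t * x1       # linear interpolation between noise and data
    u = x1 - x0                        # target velocity of the linear path
    v = model(xt, t, cond)             # model's predicted velocity field
    return float(np.mean((v - u) ** 2))

# Toy stand-in "model" that predicts zero velocity everywhere, just to run the loss.
rng = np.random.default_rng(0)
x1 = rng.standard_normal((8, 3))
loss = flow_matching_loss(lambda xt, t, c: np.zeros_like(xt), x1, None, rng)
```

In the paper's setting the model would be a permutation-equivariant network acting on the constituent set, so the loss above is applied per constituent and averaged over the cloud.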
Related papers
- OmniJet-$α$: The first cross-task foundation model for particle physics [0.0]
Foundation models are multi-dataset and multi-task machine learning methods that once pre-trained can be fine-tuned for a variety of downstream applications.
We report significant progress on this challenge on several fronts.
We demonstrate transfer learning between an unsupervised problem (jet generation) and a classic supervised task (jet tagging) with our new OmniJet-$α$ model.
arXiv Detail & Related papers (2024-03-08T19:00:01Z)
- Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improve sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, with a computational speedup compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z)
- EPiC-ly Fast Particle Cloud Generation with Flow-Matching and Diffusion [0.7255608805275865]
We present two novel methods that generate LHC jets as point clouds efficiently and accurately.
Both methods achieve state-of-the-art performance on the top-quark JetNet datasets.
arXiv Detail & Related papers (2023-09-29T18:00:03Z)
- Towards Efficient Task-Driven Model Reprogramming with Foundation Models [52.411508216448716]
Vision foundation models exhibit impressive power, benefiting from the extremely large model capacity and broad training data.
However, in practice, downstream scenarios may only support a small model due to the limited computational resources or efficiency considerations.
This brings a critical challenge for the real-world application of foundation models: one has to transfer the knowledge of a foundation model to the downstream task.
arXiv Detail & Related papers (2023-04-05T07:28:33Z)
- PC-JeDi: Diffusion for Particle Cloud Generation in High Energy Physics [0.8246494848934447]
We present a new method to efficiently generate jets in High Energy Physics called PC-JeDi.
This method uses score-based diffusion models in conjunction with transformers, which are well suited to generating jets as particle clouds.
PC-JeDi achieves competitive performance with current state-of-the-art methods across several metrics that evaluate the quality of the generated jets.
arXiv Detail & Related papers (2023-03-09T16:23:49Z)
- Particle Transformer for Jet Tagging [4.604003661048267]
We present JetClass, a new comprehensive dataset for jet tagging.
The dataset consists of 100 M jets, about two orders of magnitude larger than existing public datasets.
We propose a new Transformer-based architecture for jet tagging, called Particle Transformer (ParT).
arXiv Detail & Related papers (2022-02-08T10:36:29Z)
- Controllable and Compositional Generation with Latent-Space Energy-Based Models [60.87740144816278]
Controllable generation is one of the key requirements for successful adoption of deep generative models in real-world applications.
In this work, we use energy-based models (EBMs) to handle compositional generation over a set of attributes.
By composing energy functions with logical operators, this work is the first to achieve such compositionality in generating photo-realistic images of resolution 1024x1024.
arXiv Detail & Related papers (2021-10-21T03:31:45Z)
- Dual-Teacher Class-Incremental Learning With Data-Free Generative Replay [49.691610143011566]
We propose two novel knowledge transfer techniques for class-incremental learning (CIL).
First, we propose data-free generative replay (DF-GR) to mitigate catastrophic forgetting in CIL by using synthetic samples from a generative model.
Second, we introduce dual-teacher information distillation (DT-ID) for knowledge distillation from two teachers to one student.
arXiv Detail & Related papers (2021-06-17T22:13:15Z)
- Generative Flows with Invertible Attentions [135.23766216657745]
We introduce two types of invertible attention mechanisms for generative flow models.
We exploit split-based attention mechanisms to learn the attention weights and input representations on every two splits of flow feature maps.
Our method provides invertible attention modules with tractable Jacobian determinants, enabling seamless integration at any position in flow-based models.
arXiv Detail & Related papers (2021-06-07T20:43:04Z)
- Autoencoders for unsupervised anomaly detection in high energy physics [105.54048699217668]
We study the tagging of top jet images in a background of QCD jet images.
We show that the standard autoencoder setup cannot be considered a model-independent anomaly tagger.
We suggest improved performance measures for the task of model-independent anomaly detection.
arXiv Detail & Related papers (2021-04-19T05:06:57Z)
- Wind Speed Prediction using Deep Ensemble Learning with a Jet-like Architecture [0.28675177318965034]
The design of wings, tail, and nose of a jet improves aerodynamics resulting in a smooth and controlled flight of the jet.
The jet-like ensemble architecture exploits the diverse feature spaces of the base-regressors.
The proposed DEL-Jet technique is evaluated for ten independent runs and shows that the deep and jet-like architecture helps in improving the robustness and generalization of the learning system.
arXiv Detail & Related papers (2020-02-28T08:33:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.