Inductive Simulation of Calorimeter Showers with Normalizing Flows
- URL: http://arxiv.org/abs/2305.11934v2
- Date: Tue, 13 Feb 2024 20:09:47 GMT
- Title: Inductive Simulation of Calorimeter Showers with Normalizing Flows
- Authors: Matthew R. Buckley, Claudius Krause, Ian Pang, David Shih
- Abstract summary: iCaloFlow is a framework for fast detector simulation based on an inductive series of normalizing flows trained on the pattern of energy depositions in pairs of consecutive calorimeter layers.
As we demonstrate, iCaloFlow can realize the potential of normalizing flows in performing fast, high-fidelity simulation on detector geometries with 10 - 100 times higher granularity than previously considered.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Simulating particle detector response is the single most expensive step in
the Large Hadron Collider computational pipeline. Recently it was shown that
normalizing flows can accelerate this process while achieving unprecedented
levels of accuracy, but scaling this approach up to higher resolutions relevant
for future detector upgrades leads to prohibitive memory constraints. To
overcome this problem, we introduce Inductive CaloFlow (iCaloFlow), a framework
for fast detector simulation based on an inductive series of normalizing flows
trained on the pattern of energy depositions in pairs of consecutive
calorimeter layers. We further use a teacher-student distillation to increase
sampling speed without loss of expressivity. As we demonstrate with Datasets 2
and 3 of the CaloChallenge2022, iCaloFlow can realize the potential of
normalizing flows in performing fast, high-fidelity simulation on detector
geometries that are ~ 10 - 100 times higher granularity than previously
considered.
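The inductive idea can be sketched in a few lines: a single conditional generator is trained on pairs of consecutive layers and reused at every depth, so the memory footprint stays fixed no matter how many layers the detector has. The sketch below is a toy stand-in (a conditional affine map on Gaussian base samples); the geometry constants and the `sample_layer` conditioner are illustrative assumptions, not iCaloFlow's actual architecture or the Dataset 2/3 shapes.

```python
import numpy as np

rng = np.random.default_rng(0)
N_LAYERS, N_VOXELS = 45, 144  # illustrative geometry, not the CaloChallenge shapes

def sample_layer(prev_layer, rng):
    """Toy conditional 'flow': push a Gaussian base sample through an
    affine map whose parameters depend on the previous layer.
    A real implementation would use an invertible network conditioned
    on prev_layer (and the incident energy)."""
    z = rng.standard_normal(N_VOXELS)          # base-distribution sample
    shift = 0.5 * prev_layer                   # conditioning on layer i
    scale = 1.0 / (1.0 + prev_layer.std())
    return np.abs(shift + scale * z)           # deposited energies are non-negative

# Inductively roll out the shower, one calorimeter layer at a time,
# reusing the SAME conditional sampler for every consecutive pair.
shower = [np.abs(rng.standard_normal(N_VOXELS))]   # first layer from its own flow
for _ in range(N_LAYERS - 1):
    shower.append(sample_layer(shower[-1], rng))
shower = np.stack(shower)
print(shower.shape)  # (45, 144)
```

Because only one layer-pair model is ever held in memory, scaling to a deeper or finer detector changes the rollout length, not the model size.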
Related papers
- CaloChallenge 2022: A Community Challenge for Fast Calorimeter Simulation [22.42342223406944]
We present the results of the "Fast Calorimeter Simulation Challenge 2022" - the CaloChallenge.
We study state-of-the-art generative models on four calorimeter shower datasets of increasing dimensionality.
arXiv Detail & Related papers (2024-10-28T23:28:07Z) - Convolutional L2LFlows: Generating Accurate Showers in Highly Granular Calorimeters Using Convolutional Normalizing Flows [0.0]
We extend L2LFlows to simulate showers with a 9-times larger profile in the lateral direction.
We introduce convolutional layers and U-Net-type connections, and move from masked autoregressive flows to coupling layers.
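The move from masked autoregressive flows to coupling layers matters for sampling speed: a coupling layer can be inverted in a single pass, whereas a masked autoregressive flow must be inverted dimension by dimension. A minimal affine coupling layer in NumPy (the tiny `tanh` conditioner here is an assumption for illustration, not the paper's convolutional architecture):

```python
import numpy as np

def coupling_forward(x, w, b):
    """Affine coupling: the first half of x parameterizes a per-dimension
    scale/shift applied to the second half; the first half passes through."""
    x1, x2 = np.split(x, 2)
    h = np.tanh(w @ x1 + b)            # tiny stand-in conditioner network
    s, t = h[: len(x2)], h[len(x2):]
    y2 = x2 * np.exp(s) + t
    return np.concatenate([x1, y2])

def coupling_inverse(y, w, b):
    """Exact inverse in one pass -- no dimension-by-dimension loop,
    which is the sampling advantage over masked autoregressive flows."""
    y1, y2 = np.split(y, 2)
    h = np.tanh(w @ y1 + b)
    s, t = h[: len(y2)], h[len(y2):]
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])

rng = np.random.default_rng(1)
D = 8
w, b = rng.standard_normal((D, D // 2)), rng.standard_normal(D)
x = rng.standard_normal(D)
assert np.allclose(coupling_inverse(coupling_forward(x, w, b), w, b), x)
```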
arXiv Detail & Related papers (2024-05-30T18:25:19Z) - Unifying Simulation and Inference with Normalizing Flows [0.08796261172196743]
We show that the two tasks can be unified by using maximum likelihood estimation (MLE) from conditional generative models for energy regression.
Using an ATLAS-like calorimeter simulation, we demonstrate this concept in the context of calorimeter energy calibration.
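The MLE-based calibration amounts to: given a conditional density p(x | E) over shower observables x, regress the energy as the E that maximizes the likelihood of the observed deposits. A toy version with an assumed Gaussian response model and a grid search (a real setup would plug in a conditional flow's log-density, not this hand-written one):

```python
import numpy as np

def log_likelihood(x, E, sigma=0.1):
    """Toy conditional model p(x | E): cell energies scattered around a
    response mean that depends linearly on the true energy E.
    Stand-in for a conditional normalizing flow's log-density."""
    mean = 0.9 * E / len(x)            # assumed linear detector response per cell
    return -0.5 * np.sum((x - mean) ** 2) / sigma**2

rng = np.random.default_rng(2)
E_true = 50.0
x = 0.9 * E_true / 16 + 0.1 * rng.standard_normal(16)  # simulated deposits

# Calibrated energy = argmax_E log p(x | E), here by brute-force grid search.
grid = np.linspace(1.0, 100.0, 2000)
E_hat = grid[np.argmax([log_likelihood(x, E) for E in grid])]
print(round(E_hat, 1))  # close to E_true
```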
arXiv Detail & Related papers (2024-04-29T18:00:00Z) - Training Dynamics of Multi-Head Softmax Attention for In-Context Learning: Emergence, Convergence, and Optimality [54.20763128054692]
We study the dynamics of gradient flow for training a multi-head softmax attention model for in-context learning of multi-task linear regression.
We prove that an interesting "task allocation" phenomenon emerges during the gradient flow dynamics.
arXiv Detail & Related papers (2024-02-29T18:43:52Z) - CaloDVAE : Discrete Variational Autoencoders for Fast Calorimeter Shower
Simulation [2.0646127669654826]
Calorimeter simulation is the most computationally expensive part of Monte Carlo generation of samples.
We present a technique based on Discrete Variational Autoencoders (DVAEs) to simulate particle showers in Electromagnetic Calorimeters.
arXiv Detail & Related papers (2022-10-14T00:18:40Z) - Deep Equilibrium Optical Flow Estimation [80.80992684796566]
Recent state-of-the-art (SOTA) optical flow models use finite-step recurrent update operations to emulate traditional algorithms.
These RNNs impose large computation and memory overheads, and do not directly model such stable estimation.
We propose deep equilibrium (DEQ) flow estimators, an approach that directly solves for the flow as the infinite-level fixed point of an implicit layer.
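The fixed-point view replaces N unrolled recurrent updates with a direct solve of z = f(z, x). A naive illustration with simple iteration on a contractive update (real DEQ models use Anderson/Broyden-type solvers and implicit differentiation; the map and constants here are assumptions chosen to guarantee convergence):

```python
import numpy as np

def update(z, x, W):
    """One recurrent refinement step; the 0.25 factor keeps the map
    contractive for this W, so repeated application converges."""
    return np.tanh(0.25 * (W @ z) + x)

def solve_fixed_point(x, W, tol=1e-8, max_iter=500):
    """DEQ-style inference: iterate the update to its fixed point
    z* = f(z*, x) instead of a fixed, finite number of unrolled steps."""
    z = np.zeros_like(x)
    for _ in range(max_iter):
        z_next = update(z, x, W)
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z_next

rng = np.random.default_rng(3)
D = 16
W = rng.standard_normal((D, D)) / np.sqrt(D)
x = rng.standard_normal(D)
z_star = solve_fixed_point(x, W)
assert np.linalg.norm(update(z_star, x, W) - z_star) < 1e-6
```

The memory advantage: backpropagation through the solve needs only the fixed point itself (via the implicit function theorem), not the whole iteration history.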
arXiv Detail & Related papers (2022-04-18T17:53:44Z) - GMFlow: Learning Optical Flow via Global Matching [124.57850500778277]
We propose a GMFlow framework for learning optical flow estimation.
It consists of three main components: a customized Transformer for feature enhancement, a correlation and softmax layer for global feature matching, and a self-attention layer for flow propagation.
Our new framework outperforms 32-iteration RAFT on the challenging Sintel benchmark.
arXiv Detail & Related papers (2021-11-26T18:59:56Z) - CaloFlow: Fast and Accurate Generation of Calorimeter Showers with
Normalizing Flows [0.0]
We introduce CaloFlow, a fast detector simulation framework based on normalizing flows.
For the first time, we demonstrate that normalizing flows can reproduce many-channel calorimeter showers with extremely high fidelity.
arXiv Detail & Related papers (2021-06-09T18:00:02Z) - Machine learning for rapid discovery of laminar flow channel wall
modifications that enhance heat transfer [56.34005280792013]
We present a combination of accurate numerical simulations of arbitrary, flat, and non-flat channels and machine learning models predicting drag coefficient and Stanton number.
We show that convolutional neural networks (CNN) can accurately predict the target properties at a fraction of the time of numerical simulations.
arXiv Detail & Related papers (2021-01-19T16:14:02Z) - Fast and differentiable simulation of driven quantum systems [58.720142291102135]
We introduce a semi-analytic method based on the Dyson expansion that allows us to time-evolve driven quantum systems much faster than standard numerical methods.
We show results of the optimization of a two-qubit gate using transmon qubits in the circuit QED architecture.
arXiv Detail & Related papers (2020-12-16T21:43:38Z) - Self Normalizing Flows [65.73510214694987]
We propose a flexible framework for training normalizing flows by replacing expensive terms in the gradient by learned approximate inverses at each layer.
This reduces the computational complexity of each layer's exact update from $\mathcal{O}(D^3)$ to $\mathcal{O}(D^2)$.
We show experimentally that such models are remarkably stable and optimize to similar data likelihood values as their exact gradient counterparts.
arXiv Detail & Related papers (2020-11-14T09:51:51Z)
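The $\mathcal{O}(D^3)$-to-$\mathcal{O}(D^2)$ reduction comes from never forming an explicit inverse or determinant during training: the inverse of each layer is itself learned. A toy sketch for a single linear flow layer, training an approximate inverse R of W with a reconstruction loss using only matrix-vector/matrix-matrix products (an illustrative assumption, not the paper's full training scheme):

```python
import numpy as np

rng = np.random.default_rng(4)
D = 6
W = np.eye(D) + 0.1 * rng.standard_normal((D, D))  # flow layer weight
R = np.eye(D)                                       # learned approximate inverse

# Train R so that R(Wx) = x on random batches. Each step costs only
# matrix products -- no O(D^3) explicit inversion or determinant.
lr = 0.1
for _ in range(2000):
    x = rng.standard_normal((32, D))
    y = x @ W.T                      # forward pass through the layer
    err = y @ R.T - x                # reconstruction error R(Wx) - x
    grad_R = err.T @ y / len(x)      # gradient of 0.5 * mean ||err||^2
    R -= lr * grad_R

# After training, R is a close approximation of W^{-1}.
assert np.abs(R @ W - np.eye(D)).max() < 1e-2
```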
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated list (including all information) and is not responsible for any consequences of its use.