CaloFlow II: Even Faster and Still Accurate Generation of Calorimeter
Showers with Normalizing Flows
- URL: http://arxiv.org/abs/2110.11377v2
- Date: Fri, 5 May 2023 09:03:45 GMT
- Title: CaloFlow II: Even Faster and Still Accurate Generation of Calorimeter
Showers with Normalizing Flows
- Authors: Claudius Krause and David Shih
- Abstract summary: Recently, we introduced CaloFlow, a high-fidelity generative model for GEANT4 calorimeter shower emulation based on normalizing flows.
Here, we present CaloFlow v2, an improvement on our original framework that speeds up shower generation by a further factor of 500 relative to the original.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, we introduced CaloFlow, a high-fidelity generative model for GEANT4
calorimeter shower emulation based on normalizing flows. Here, we present
CaloFlow v2, an improvement on our original framework that speeds up shower
generation by a further factor of 500 relative to the original. The improvement
is based on a technique called Probability Density Distillation, originally
developed for speech synthesis in the ML literature, and which we develop
further by introducing a set of powerful new loss terms. We demonstrate that
CaloFlow v2 preserves the same high fidelity of the original using qualitative
(average images, histograms of high level features) and quantitative
(classifier metric between GEANT4 and generated samples) measures. The result
is a generative model for calorimeter showers that matches the state-of-the-art
in speed (a factor of $10^4$ faster than GEANT4) and greatly surpasses the
previous state-of-the-art in fidelity.
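The speed-up rests on Probability Density Distillation, in which a fast-sampling "student" network is trained to reproduce the density learned by a slower "teacher". As a rough sketch of how such an objective can look for normalizing flows (not the paper's exact implementation, and omitting the additional loss terms the abstract mentions), the objects teacher_flow and student_flow below are hypothetical placeholders assumed to expose log_prob(x) and a differentiable sample_and_log_prob(n):

    import torch

    def pdd_loss(teacher_flow, student_flow, batch_size=1024):
        # Draw samples from the fast-sampling student together with their
        # log-density under the student itself; the sampling pass is
        # reparameterized, so gradients reach the student's parameters.
        x, log_q = student_flow.sample_and_log_prob(batch_size)
        # Score the same samples under the teacher, whose parameters are
        # assumed to be frozen (kept out of the optimizer); gradients still
        # flow through x into the student.
        log_p = teacher_flow.log_prob(x)
        # Monte-Carlo estimate of KL(student || teacher): minimizing it pulls
        # the student's sampling distribution toward the teacher's density.
        return (log_q - log_p).mean()

The classifier metric cited as the quantitative fidelity measure can likewise be sketched: train a binary classifier to separate reference GEANT4 showers from generated ones and report its AUC, where a value near 0.5 means the two samples are hard to tell apart. The arrays real and fake below are hypothetical (n_showers, n_features) inputs.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import roc_auc_score

    def classifier_auc(real, fake):
        # Label GEANT4 showers 1 and generated showers 0, then hold out a
        # stratified test set for the final AUC estimate.
        X = np.concatenate([real, fake])
        y = np.concatenate([np.ones(len(real)), np.zeros(len(fake))])
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y)
        clf = MLPClassifier(hidden_layer_sizes=(256, 256), max_iter=200)
        clf.fit(X_tr, y_tr)
        # AUC near 0.5: generated showers are indistinguishable from GEANT4;
        # AUC near 1.0: the classifier separates them easily.
        return roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])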
Related papers
- Align Your Flow: Scaling Continuous-Time Flow Map Distillation [63.927438959502226]
Flow maps connect any two noise levels in a single step and remain effective across all step counts. We extensively validate our flow map models, called Align Your Flow, on challenging image generation benchmarks. We show text-to-image flow map models that outperform all existing non-adversarially trained few-step samplers in text-conditioned synthesis.
arXiv Detail & Related papers (2025-06-17T15:06:07Z) - Mean Flows for One-step Generative Modeling [64.4997821467102]
We propose a principled and effective framework for one-step generative modeling. A well-defined identity between average and instantaneous velocities is derived and used to guide neural network training. Our method, termed the MeanFlow model, is self-contained and requires no pre-training, distillation, or curriculum learning.
arXiv Detail & Related papers (2025-05-19T17:59:42Z) - Gaussian Mixture Flow Matching Models [51.976452482535954]
Diffusion models approximate the denoising distribution as a Gaussian and predict its mean, whereas flow matching models reparameterize the Gaussian mean as flow velocity.
They underperform in few-step sampling due to discretization error and tend to produce over-saturated colors under classifier-free guidance (CFG).
We introduce a novel probabilistic guidance scheme that mitigates the over-saturation issues of CFG and improves image generation quality.
arXiv Detail & Related papers (2025-04-07T17:59:42Z) - Jet: A Modern Transformer-Based Normalizing Flow [62.2573739835562]
We revisit the design of the coupling-based normalizing flow models by carefully ablating prior design choices.
We achieve state-of-the-art quantitative and qualitative performance with a much simpler architecture.
arXiv Detail & Related papers (2024-12-19T18:09:42Z) - CaloChallenge 2022: A Community Challenge for Fast Calorimeter Simulation [22.42342223406944]
We present the results of the "Fast Calorimeter Simulation Challenge 2022" - the CaloChallenge.
We study state-of-the-art generative models on four calorimeter shower datasets of increasing dimensionality.
arXiv Detail & Related papers (2024-10-28T23:28:07Z) - One-Step Diffusion Distillation through Score Implicit Matching [74.91234358410281]
We present Score Implicit Matching (SIM), a new approach to distilling pre-trained diffusion models into single-step generator models.
SIM shows strong empirical performance for one-step generators.
By applying SIM to a leading transformer-based diffusion model, we distill a single-step generator for text-to-image generation.
arXiv Detail & Related papers (2024-10-22T08:17:20Z) - FlowTurbo: Towards Real-time Flow-Based Image Generation with Velocity Refiner [70.90505084288057]
Flow-based models tend to produce straighter trajectories during sampling.
We introduce several techniques including a pseudo corrector and sample-aware compilation to further reduce inference time.
FlowTurbo reaches an FID of 2.12 on ImageNet at 100 ms/img and an FID of 3.93 at 38 ms/img.
arXiv Detail & Related papers (2024-09-26T17:59:51Z) - T2V-Turbo: Breaking the Quality Bottleneck of Video Consistency Model with Mixed Reward Feedback [111.40967379458752]
We introduce T2V-Turbo, which integrates feedback from a mixture of differentiable reward models into the consistency distillation process of a pre-trained T2V model.
Remarkably, the 4-step generations from our T2V-Turbo achieve the highest total score on VBench, even surpassing Gen-2 and Pika.
arXiv Detail & Related papers (2024-05-29T04:26:17Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models to plan generation in the offline reinforcement learning setting, with a substantial speedup in computation compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - CaloClouds II: Ultra-Fast Geometry-Independent Highly-Granular
Calorimeter Simulation [0.0]
Generative machine learning models have been shown to speed up and augment the traditional simulation chain in physics analysis.
A major advancement is the recently introduced CaloClouds model, which generates calorimeter showers as point clouds for the electromagnetic calorimeter of the envisioned International Large Detector (ILD).
In this work, we introduce CaloClouds II, which features a number of key improvements. These include continuous-time score-based modelling, which allows for 25-step sampling with fidelity comparable to CaloClouds while yielding a $6\times$ speed-up over Geant4 on a single CPU.
arXiv Detail & Related papers (2023-09-11T18:00:02Z) - Inductive Simulation of Calorimeter Showers with Normalizing Flows [0.0]
iCaloFlow is a framework for fast detector simulation based on an inductive series of normalizing flows trained on the pattern of energy depositions in pairs of consecutive calorimeter layers.
As we demonstrate, iCaloFlow can realize the potential of normalizing flows in performing fast, high-fidelity simulation on detector geometries with 10-100 times higher granularity than previously considered.
arXiv Detail & Related papers (2023-05-19T18:00:00Z) - Q-Diffusion: Quantizing Diffusion Models [52.978047249670276]
Post-training quantization (PTQ) is considered a go-to compression method for other tasks.
We propose a novel PTQ method specifically tailored to the unique multi-timestep pipeline and model architecture of diffusion models.
We show that our method can quantize full-precision unconditional diffusion models to 4 bits while maintaining comparable performance.
arXiv Detail & Related papers (2023-02-08T19:38:59Z) - CaloFlow for CaloChallenge Dataset 1 [0.0]
CaloFlow is a new and promising approach to fast calorimeter simulation based on normalizing flows.
We show how it can produce high-fidelity samples with a sampling time that is several orders of magnitude faster than Geant4.
arXiv Detail & Related papers (2022-10-25T18:00:25Z) - GMFlow: Learning Optical Flow via Global Matching [124.57850500778277]
We propose a GMFlow framework for learning optical flow estimation.
It consists of three main components: a customized Transformer for feature enhancement, a correlation and softmax layer for global feature matching, and a self-attention layer for flow propagation.
Our new framework outperforms 32-iteration RAFT on the challenging Sintel benchmark.
arXiv Detail & Related papers (2021-11-26T18:59:56Z) - CaloFlow: Fast and Accurate Generation of Calorimeter Showers with
Normalizing Flows [0.0]
We introduce CaloFlow, a fast detector simulation framework based on normalizing flows.
For the first time, we demonstrate that normalizing flows can reproduce many-channel calorimeter showers with extremely high fidelity.
arXiv Detail & Related papers (2021-06-09T18:00:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.