Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics
- URL: http://arxiv.org/abs/2405.14806v3
- Date: Wed, 30 Oct 2024 15:50:21 GMT
- Title: Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics
- Authors: Jonas Spinner, Victor Bresó, Pim de Haan, Tilman Plehn, Jesse Thaler, Johann Brehmer
- Abstract summary: Lorentz Geometric Algebra Transformer (L-GATr) is a new multi-purpose architecture for high-energy physics.
L-GATr is first demonstrated on regression and classification tasks from particle physics.
We then construct the first Lorentz-equivariant generative model: a continuous normalizing flow based on an L-GATr network.
- Score: 4.4970885242855845
- Abstract: Extracting scientific understanding from particle-physics experiments requires solving diverse learning problems with high precision and good data efficiency. We propose the Lorentz Geometric Algebra Transformer (L-GATr), a new multi-purpose architecture for high-energy physics. L-GATr represents high-energy data in a geometric algebra over four-dimensional space-time and is equivariant under Lorentz transformations, the symmetry group of relativistic kinematics. At the same time, the architecture is a Transformer, which makes it versatile and scalable to large systems. L-GATr is first demonstrated on regression and classification tasks from particle physics. We then construct the first Lorentz-equivariant generative model: a continuous normalizing flow based on an L-GATr network, trained with Riemannian flow matching. Across our experiments, L-GATr is on par with or outperforms strong domain-specific baselines.
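As a rough illustration of the data representation described in the abstract (a hypothetical NumPy sketch, not the authors' code; the multivector index layout is an assumption), the snippet below embeds particle four-momenta into the grade-1 slot of a 16-component Cl(1,3) multivector and checks that the Minkowski inner products, the invariants an equivariant network may depend on, are unchanged by a boost:

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -), acting on (E, px, py, pz).
ETA = np.diag([1.0, -1.0, -1.0, -1.0])

def boost_x(rapidity):
    """Lorentz boost along the x-axis, used here only as a test transformation."""
    ch, sh = np.cosh(rapidity), np.sinh(rapidity)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = ch
    L[0, 1] = L[1, 0] = sh
    return L

def minkowski_gram(p):
    """All pairwise Minkowski inner products p_i^T eta p_j."""
    return p @ ETA @ p.T

# Toy event: two four-momenta, one per row.
p = np.array([[50.0, 10.0, 20.0, 5.0],
              [30.0, -5.0, 12.0, 8.0]])

# Hypothetical embedding: each four-momentum fills the grade-1 (vector) slot
# of a 16-component Cl(1,3) multivector; all other grades start at zero.
mv = np.zeros((len(p), 16))
mv[:, 1:5] = p  # assumed layout: [scalar | vector(4) | bivector(6) | ...]

# Lorentz invariance of the inputs' Gram matrix: what an equivariant
# network is allowed to depend on does not change under a boost.
p_boosted = p @ boost_x(0.3).T
assert np.allclose(minkowski_gram(p), minkowski_gram(p_boosted))
```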
Related papers
- A Lorentz-Equivariant Transformer for All of the LHC [5.329375781648604]
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider.
arXiv Detail & Related papers (2024-11-01T08:40:42Z)
- Transolver: A Fast Transformer Solver for PDEs on General Geometries [66.82060415622871]
We present Transolver, which learns intrinsic physical states hidden behind discretized geometries.
By computing attention over physics-aware tokens encoded from slices, Transolver captures intricate physical correlations effectively; a toy sketch of this slice-then-attend pattern follows.
Transolver achieves consistent state-of-the-art results, with a 22% relative gain across six standard benchmarks, and also excels in large-scale industrial simulations.
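A minimal sketch of that pattern, assuming a soft assignment of mesh points to a small number of slice tokens (all names and shapes here are hypothetical, not Transolver's actual API):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def physics_attention(feats, W_slice, W_q, W_k, W_v):
    """One slice-then-attend step: N point features -> M slice tokens -> N points."""
    w = softmax(feats @ W_slice, axis=1)                      # (N, M) soft slice assignment
    tokens = (w.T @ feats) / (w.sum(axis=0)[:, None] + 1e-9)  # (M, D) slice states
    q, k, v = tokens @ W_q, tokens @ W_k, tokens @ W_v        # attention among tokens only
    att = softmax(q @ k.T / np.sqrt(k.shape[1]), axis=1)      # (M, M), independent of N
    return w @ (att @ v)                                      # broadcast back to points

rng = np.random.default_rng(0)
N, M, D = 500, 8, 16  # mesh points, slices, feature width
feats = rng.normal(size=(N, D))
weights = [rng.normal(size=(D, s)) for s in (M, D, D, D)]
out = physics_attention(feats, *weights)
assert out.shape == (N, D)
```

Because attention runs over the M slice tokens rather than the N mesh points, the cost of the attention step does not grow with the mesh size.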
arXiv Detail & Related papers (2024-02-04T06:37:38Z)
- Explainable Equivariant Neural Networks for Particle Physics: PELICAN [51.02649432050852]
PELICAN is a novel permutation equivariant and Lorentz invariant aggregator network.
We present a study of the PELICAN algorithm architecture in the context of both tagging (classification) and reconstructing (regression) Lorentz-boosted top quarks.
We extend PELICAN to distinguishing quark-initiated from gluon-initiated jets, and to multi-class identification across five separate jet categories.
arXiv Detail & Related papers (2023-07-31T09:08:40Z)
- PELICAN: Permutation Equivariant and Lorentz Invariant or Covariant Aggregator Network for Particle Physics [64.5726087590283]
We present a machine learning architecture that uses a set of inputs maximally reduced with respect to the full 6-dimensional Lorentz symmetry.
We show that the resulting network outperforms all existing competitors despite much lower model complexity; its invariant inputs are sketched below.
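A toy version of such maximally reduced inputs, assuming they are the pairwise Minkowski dot products of the four-momenta, together with a small subset of the linear maps that are equivariant under simultaneous row and column permutations (PELICAN itself uses a larger basis of such aggregators; this is a hypothetical NumPy sketch):

```python
import numpy as np

ETA = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

def dot_matrix(p):
    """Pairwise Minkowski dot products d_ij = p_i . p_j, the invariant inputs."""
    return p @ ETA @ p.T  # (N, N), unchanged by any Lorentz transformation

def perm_equivariant_maps(d):
    """A small subset of the linear maps taking an N x N array to an N x N array
    equivariantly under simultaneous row/column permutations."""
    row = np.broadcast_to(d.mean(axis=1, keepdims=True), d.shape)  # row averages
    col = np.broadcast_to(d.mean(axis=0, keepdims=True), d.shape)  # column averages
    tot = np.full_like(d, d.mean())                                # global average
    return np.stack([d, row, col, tot], axis=-1)  # (N, N, 4) feature channels

p = np.array([[45.0, 12.0, -3.0, 40.0],
              [38.0, -7.0, 21.0, 25.0],
              [27.0, 2.0, -14.0, 18.0]])  # three four-momenta (E, px, py, pz)
feats = perm_equivariant_maps(dot_matrix(p))
assert feats.shape == (3, 3, 4)
```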
arXiv Detail & Related papers (2022-11-01T13:36:50Z)
- Clifford Neural Layers for PDE Modeling [61.07764203014727]
Partial differential equations (PDEs) are used widely across science and engineering to describe physical processes as scalar and vector fields that interact and coevolve over time.
Current methods do not explicitly account for the relationships between different fields and their internal components, which are often correlated.
This paper presents the first usage of such multivector representations together with Clifford convolutions and Clifford Fourier transforms in the context of deep learning.
The resulting Clifford neural layers are universally applicable and will find direct use in fluid dynamics, weather forecasting, and the modeling of physical systems in general; the sketch below shows the multivector product at the core of such layers.
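For concreteness, here is the geometric product in the small algebra Cl(2,0), the kind of multivector primitive a Clifford convolution applies between filters and field values (a hand-rolled sketch, not the paper's implementation):

```python
import numpy as np

def geometric_product_2d(a, b):
    """Geometric product in Cl(2,0) with basis (1, e1, e2, e12),
    where e1^2 = e2^2 = +1 and e12 = e1 e2, so e12^2 = -1."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.array([
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 (bivector) part
    ])

# Sanity checks: e1 * e2 = e12, and e12 * e12 = -1.
e1 = np.array([0.0, 1.0, 0.0, 0.0])
e2 = np.array([0.0, 0.0, 1.0, 0.0])
e12 = geometric_product_2d(e1, e2)
assert np.allclose(e12, [0.0, 0.0, 0.0, 1.0])
assert np.allclose(geometric_product_2d(e12, e12), [-1.0, 0.0, 0.0, 0.0])
```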
arXiv Detail & Related papers (2022-09-08T17:35:30Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Scalars are universal: Gauge-equivariant machine learning, structured like classical physics [0.0]
We consider neural networks that respect the gauge symmetries, or coordinate freedom, of physical law.
We show that it is simple to parameterize universally approximating functions that are equivariant under these symmetries.
These results demonstrate theoretically that gauge-invariant deep learning models for classical physics, with good scaling to large problems, are feasible right now; a toy version of the scalars-first construction follows.
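A minimal sketch of that construction, assuming the recipe of weighting input vectors by scalar functions of their pairwise dot products (hypothetical code, not the authors'):

```python
import numpy as np

def equivariant_fn(vs):
    """O(d)-equivariant vector function built the 'scalars first' way:
    scalar coefficients computed from invariants (pairwise dot products),
    then used to weight the input vectors themselves."""
    gram = vs @ vs.T                    # invariant under rotations/reflections
    coeffs = np.tanh(gram).sum(axis=1)  # one scalar coefficient per vector
    return coeffs @ vs                  # equivariant combination, shape (d,)

rng = np.random.default_rng(1)
vs = rng.normal(size=(5, 3))                  # five vectors in R^3
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # a random orthogonal matrix

# Equivariance: rotating the inputs rotates the output the same way.
assert np.allclose(equivariant_fn(vs @ Q.T), equivariant_fn(vs) @ Q.T)
```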
arXiv Detail & Related papers (2021-06-11T20:51:38Z)
- Lorentz Group Equivariant Neural Network for Particle Physics [58.56031187968692]
We present a neural network architecture that is fully equivariant with respect to transformations under the Lorentz group.
For classification tasks in particle physics, we demonstrate that such an equivariant architecture leads to drastically simpler models that have relatively few learnable parameters.
arXiv Detail & Related papers (2020-06-08T17:54:43Z)