Lorentz Group Equivariant Neural Network for Particle Physics
- URL: http://arxiv.org/abs/2006.04780v1
- Date: Mon, 8 Jun 2020 17:54:43 GMT
- Title: Lorentz Group Equivariant Neural Network for Particle Physics
- Authors: Alexander Bogatskiy, Brandon Anderson, Jan T. Offermann, Marwah
Roussi, David W. Miller, Risi Kondor
- Abstract summary: We present a neural network architecture that is fully equivariant with respect to transformations under the Lorentz group.
For classification tasks in particle physics, we demonstrate that such an equivariant architecture leads to drastically simpler models that have relatively few learnable parameters.
- Score: 58.56031187968692
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a neural network architecture that is fully equivariant with
respect to transformations under the Lorentz group, a fundamental symmetry of
space and time in physics. The architecture is based on the theory of the
finite-dimensional representations of the Lorentz group and the equivariant
nonlinearity involves the tensor product. For classification tasks in particle
physics, we demonstrate that such an equivariant architecture leads to
drastically simpler models with relatively few learnable parameters that
are far more physically interpretable than leading CNN- and point-cloud-based
approaches. The competitive performance of the network is
demonstrated on a public classification dataset [27] for tagging top quark
decays given energy-momenta of jet constituents produced in proton-proton
collisions.
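The symmetry the abstract refers to can be illustrated with a minimal numerical check (this sketch is not code from the paper): Minkowski inner products of four-momenta, the basic Lorentz-invariant quantities an equivariant architecture can build on, are unchanged under a boost.

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -)
eta = np.diag([1.0, -1.0, -1.0, -1.0])

def minkowski_dot(p, q):
    """Lorentz-invariant inner product of two four-vectors (E, px, py, pz)."""
    return p @ eta @ q

def boost_x(rapidity):
    """Lorentz boost along the x-axis with the given rapidity."""
    ch, sh = np.cosh(rapidity), np.sinh(rapidity)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = ch
    L[0, 1] = L[1, 0] = sh
    return L

# Two example four-momenta of jet constituents (illustrative values)
p = np.array([10.0, 3.0, 2.0, 1.0])
q = np.array([8.0, 1.0, 4.0, 2.0])

L = boost_x(0.7)
# The inner product is invariant: p^T L^T eta L q = p^T eta q
assert np.isclose(minkowski_dot(p, q), minkowski_dot(L @ p, L @ q))
```

A fully equivariant network guarantees this kind of consistency for all of its internal features, not just scalar inputs.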
Related papers
- Lie-Equivariant Quantum Graph Neural Networks [4.051777802443125]
Binary classification tasks are ubiquitous in analyses of the vast amounts of LHC data.
We develop a Lie-Equivariant Quantum Graph Neural Network (Lie-EQGNN), a quantum model that is not only data efficient, but also has symmetry-preserving properties.
arXiv Detail & Related papers (2024-11-22T19:15:13Z)
- Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics [4.4970885242855845]
Lorentz Geometric Algebra Transformer (L-GATr) is a new multi-purpose architecture for high-energy physics.
L-GATr is first demonstrated on regression and classification tasks from particle physics.
We then construct the first Lorentz-equivariant generative model: a continuous normalizing flow based on an L-GATr network.
arXiv Detail & Related papers (2024-05-23T17:15:41Z)
- SEGNO: Generalizing Equivariant Graph Neural Networks with Physical Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z)
- Explainable Equivariant Neural Networks for Particle Physics: PELICAN [51.02649432050852]
PELICAN is a novel permutation equivariant and Lorentz invariant aggregator network.
We present a study of the PELICAN algorithm architecture in the context of both tagging (classification) and reconstructing (regression) Lorentz-boosted top quarks.
We extend the application of PELICAN to the tasks of identifying quark-initiated vs. gluon-initiated jets, and to multi-class identification across five separate target categories of jets.
arXiv Detail & Related papers (2023-07-31T09:08:40Z)
- PELICAN: Permutation Equivariant and Lorentz Invariant or Covariant Aggregator Network for Particle Physics [64.5726087590283]
We present a machine learning architecture that uses a set of inputs maximally reduced with respect to the full 6-dimensional Lorentz symmetry.
We show that the resulting network outperforms all existing competitors despite much lower model complexity.
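The "maximally reduced" inputs this summary mentions can be sketched as the matrix of pairwise Minkowski dot products of the constituent four-momenta, which is Lorentz-invariant by construction. This is a hypothetical illustration of that kind of input reduction, not the paper's implementation:

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -)
eta = np.diag([1.0, -1.0, -1.0, -1.0])

def pairwise_invariants(P):
    """N x N matrix D with D[i, j] = p_i . p_j (Minkowski dot products).

    P has shape (N, 4), rows are four-momenta (E, px, py, pz).
    Every entry is a Lorentz scalar, so the whole matrix is
    invariant under a common Lorentz transformation of all rows."""
    return P @ eta @ P.T

# Toy jet with three constituents (illustrative values)
P = np.array([
    [5.0, 1.0, 2.0, 3.0],
    [4.0, 0.5, 1.0, 2.0],
    [6.0, 2.0, 1.0, 1.0],
])
D = pairwise_invariants(P)

assert D.shape == (3, 3)
assert np.allclose(D, D.T)  # symmetric by construction
# Diagonal entries are the invariant masses squared, e.g.
# D[0, 0] = 25 - 1 - 4 - 9 = 11
assert np.isclose(D[0, 0], 11.0)
```

Feeding such invariants to a permutation-equivariant aggregator removes the Lorentz degrees of freedom from the input, which is consistent with the low model complexity reported above.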
arXiv Detail & Related papers (2022-11-01T13:36:50Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose the Graph Mechanics Network (GMN), which is efficient, equivariant, and constraint-aware.
GMN represents, by generalized coordinates, the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Generalization capabilities of neural networks in lattice applications [0.0]
We investigate the advantages of adopting translationally equivariant neural networks over non-equivariant ones.
We show that our best equivariant architectures can perform and generalize significantly better than their non-equivariant counterparts.
arXiv Detail & Related papers (2021-12-23T11:48:06Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.