Preserving gauge invariance in neural networks
- URL: http://arxiv.org/abs/2112.11239v1
- Date: Tue, 21 Dec 2021 14:08:12 GMT
- Title: Preserving gauge invariance in neural networks
- Authors: Matteo Favoni, Andreas Ipp, David I. Müller, Daniel Schuh
- Abstract summary: We present lattice gauge equivariant convolutional neural networks (L-CNNs) and show how they can represent a large class of gauge invariant and equivariant functions on the lattice.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In these proceedings we present lattice gauge equivariant convolutional
neural networks (L-CNNs) which are able to process data from lattice gauge
theory simulations while exactly preserving gauge symmetry. We review aspects
of the architecture and show how L-CNNs can represent a large class of gauge
invariant and equivariant functions on the lattice. We compare the performance
of L-CNNs and non-equivariant networks using a non-linear regression problem
and demonstrate how gauge invariance is broken for non-equivariant models.
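To make the symmetry concrete: on the lattice, a gauge transformation acts on the link variables as U_mu(x) -> Omega(x) U_mu(x) Omega^dagger(x + mu), and the trace of any closed loop of links, such as the elementary plaquette, is left unchanged. The NumPy sketch below is not taken from the paper; the SU(2) gauge group, the small two-dimensional periodic lattice, and all function names are illustrative assumptions. It verifies the invariance numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
L = 4  # toy lattice extent (illustrative assumption)

# Pauli matrices, used to build SU(2) elements
sigma = np.array([[[0, 1], [1, 0]],
                  [[0, -1j], [1j, 0]],
                  [[1, 0], [0, -1]]], dtype=complex)

def random_su2(shape):
    """Draw SU(2) matrices U = a0*1 + i*(a1,a2,a3).sigma with |a| = 1."""
    a = rng.normal(size=shape + (4,))
    a /= np.linalg.norm(a, axis=-1, keepdims=True)
    return (a[..., 0, None, None] * np.eye(2)
            + 1j * np.einsum('...k,kij->...ij', a[..., 1:], sigma))

# Link variables U[mu, x, y] on an L x L periodic lattice (mu = 0, 1)
U = random_su2((2, L, L))

def plaquette_trace(U):
    """Re tr U_x(n) U_y(n+x) U_x(n+y)^dag U_y(n)^dag at every site n."""
    Ux, Uy = U[0], U[1]
    Ux_shift_y = np.roll(Ux, -1, axis=1)  # U_x(n + y_hat)
    Uy_shift_x = np.roll(Uy, -1, axis=0)  # U_y(n + x_hat)
    P = (Ux @ Uy_shift_x
         @ np.conj(np.swapaxes(Ux_shift_y, -1, -2))
         @ np.conj(np.swapaxes(Uy, -1, -2)))
    return np.trace(P, axis1=-2, axis2=-1).real

def gauge_transform(U, Omega):
    """Apply U_mu(n) -> Omega(n) U_mu(n) Omega(n + mu_hat)^dag."""
    out = np.empty_like(U)
    for mu in range(2):
        Omega_shift = np.roll(Omega, -1, axis=mu)
        out[mu] = Omega @ U[mu] @ np.conj(np.swapaxes(Omega_shift, -1, -2))
    return out

Omega = random_su2((L, L))
before = plaquette_trace(U)
after = plaquette_trace(gauge_transform(U, Omega))
print(np.max(np.abs(before - after)))  # ~1e-15: the trace is exactly gauge invariant
```

A generic CNN acting on the raw link entries carries no such guarantee, which is precisely the breaking of gauge invariance that the regression experiment exposes for non-equivariant models.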
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Non Commutative Convolutional Signal Models in Neural Networks: Stability to Small Deformations [111.27636893711055]
We study the filtering and stability properties of non commutative convolutional filters.
Our results have direct implications for group neural networks, multigraph neural networks and quaternion neural networks.
arXiv Detail & Related papers (2023-10-05T20:27:22Z)
- Geometrical aspects of lattice gauge equivariant convolutional neural networks [0.0]
Lattice gauge equivariant convolutional neural networks (L-CNNs) are a framework for convolutional neural networks that can be applied to non-Abelian lattice gauge theories.
arXiv Detail & Related papers (2023-03-20T20:49:08Z)
- Applications of Lattice Gauge Equivariant Neural Networks [0.0]
Lattice Gauge Equivariant Convolutional Neural Networks (L-CNNs) can generalize better to differently sized lattices than traditional neural networks.
We present our progress on possible applications of L-CNNs to Wilson flow or continuous normalizing flow.
arXiv Detail & Related papers (2022-12-01T19:32:42Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
Currently there exists a rather promising new trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs equivariant with respect to the two-dimensional Euclidean group with vector-valued neuron activations and the corresponding independently introduced equivariant Gaussian processes.
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Revisiting Transformation Invariant Geometric Deep Learning: Are Initial Representations All You Need? [80.86819657126041]
We show that transformation-invariant and distance-preserving initial representations are sufficient to achieve transformation invariance.
Specifically, we realize transformation-invariant and distance-preserving initial point representations by modifying multi-dimensional scaling.
We prove that the resulting networks, dubbed TinvNN, strictly guarantee transformation invariance and are general and flexible enough to be combined with existing neural networks.
arXiv Detail & Related papers (2021-12-23T03:52:33Z)
- Lattice gauge symmetry in neural networks [0.0]
We review a novel neural network architecture called lattice gauge equivariant convolutional neural networks (L-CNNs).
We discuss the concept of gauge equivariance which we use to explicitly construct a gauge equivariant convolutional layer and a bilinear layer.
The performance of L-CNNs and non-equivariant CNNs is compared using seemingly simple non-linear regression tasks.
arXiv Detail & Related papers (2021-11-08T11:20:11Z)
- Encoding Involutory Invariance in Neural Networks [1.6371837018687636]
In certain situations, neural networks (NNs) are trained on data that obey underlying physical symmetries.
In this work, we explore a special kind of symmetry where functions are invariant with respect to involutory linear/affine transformations up to parity.
Numerical experiments indicate that the proposed models outperform baseline networks while respecting the imposed symmetry.
An adaptation of our technique to convolutional NN classification tasks for datasets with inherent horizontal/vertical reflection symmetry has also been proposed.
arXiv Detail & Related papers (2021-06-07T16:07:15Z)
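A standard way to hard-wire the kind of symmetry described in the entry above, offered here as a minimal sketch rather than the paper's actual construction, is to symmetrize an unconstrained network g over the involution P (with P^2 = I and parity s = +1 or -1): f(x) = (g(x) + s*g(Px))/2 then satisfies f(Px) = s*f(x) exactly. All names below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unconstrained base network g (toy MLP with fixed random weights)
W1, W2 = rng.normal(size=(4, 32)), rng.normal(size=(32, 1))
def g(x):
    return np.tanh(x @ W1) @ W2

P = np.diag([1.0, -1.0, 1.0, -1.0])  # an involutory reflection: P @ P = I
s = -1.0                             # parity: we want f(Px) = s * f(x)

def f(x):
    """Symmetrized network: exactly (anti-)invariant under x -> P x."""
    return 0.5 * (g(x) + s * g(x @ P.T))

x = rng.normal(size=(10, 4))
print(np.max(np.abs(f(x @ P.T) - s * f(x))))  # ~1e-16: holds by construction
```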
- E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces E(n)-Equivariant Graph Neural Networks (EGNNs), a new model for learning graph neural networks equivariant to rotations, translations, reflections, and permutations.
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while it still achieves competitive or better performance.
arXiv Detail & Related papers (2021-02-19T10:25:33Z)
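The EGNN update rule summarized above admits a compact sketch: messages are computed from node features and invariant squared distances, m_ij = phi_e(h_i, h_j, |x_i - x_j|^2); coordinates move along difference vectors, x_i' = x_i + C * sum_{j != i} (x_i - x_j) * phi_x(m_ij); and features update as h_i' = phi_h(h_i, sum_j m_ij). The NumPy sketch below is a minimal reading of these equations, with fixed random weights standing in for the learned networks and all dimensions chosen arbitrarily; it checks E(n) equivariance numerically:

```python
import numpy as np

rng = np.random.default_rng(1)

def mlp(sizes):
    """Toy MLP with fixed random weights, standing in for the learned networks."""
    Ws = [rng.normal(size=(m, k)) / np.sqrt(m) for m, k in zip(sizes[:-1], sizes[1:])]
    def f(z):
        for W in Ws[:-1]:
            z = np.tanh(z @ W)
        return z @ Ws[-1]
    return f

d_h, d_m, dim = 8, 16, 3
phi_e = mlp([2 * d_h + 1, 32, d_m])  # edge / message network
phi_x = mlp([d_m, 32, 1])            # coordinate weight network
phi_h = mlp([d_h + d_m, 32, d_h])    # node update network

def egnn_layer(h, x):
    """One EGNN layer: invariant messages, equivariant coordinate update."""
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]          # pairwise differences x_i - x_j
    d2 = np.sum(diff**2, axis=-1, keepdims=True)  # invariant squared distances
    hi = np.repeat(h[:, None, :], n, axis=1)
    hj = np.repeat(h[None, :, :], n, axis=0)
    m = phi_e(np.concatenate([hi, hj, d2], axis=-1))
    mask = 1.0 - np.eye(n)[..., None]             # exclude j = i
    x_new = x + np.sum(diff * phi_x(m) * mask, axis=1) / (n - 1)
    h_new = phi_h(np.concatenate([h, np.sum(m * mask, axis=1)], axis=-1))
    return h_new, x_new

h, x = rng.normal(size=(5, d_h)), rng.normal(size=(5, dim))
Q, _ = np.linalg.qr(rng.normal(size=(dim, dim)))  # random orthogonal map
t = rng.normal(size=dim)                          # random translation

h1, x1 = egnn_layer(h, x @ Q.T + t)  # transform the input, then apply the layer
h2, x2 = egnn_layer(h, x)            # apply the layer, then transform the output
print(np.max(np.abs(x1 - (x2 @ Q.T + t))))  # ~1e-13: coordinates are equivariant
print(np.max(np.abs(h1 - h2)))              # ~1e-14: features are invariant
```

Because the messages see only squared distances and the coordinate update is a linear combination of difference vectors, no computationally expensive higher-order representations are needed, which is the design point the entry highlights.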
- Lattice gauge equivariant convolutional neural networks [0.0]
We propose Lattice gauge equivariant Convolutional Neural Networks (L-CNNs) for generic machine learning applications.
We show that L-CNNs can learn and generalize gauge invariant quantities that traditional convolutional neural networks are incapable of finding.
arXiv Detail & Related papers (2020-12-23T19:00:01Z)
- Learning Invariances in Neural Networks [51.20867785006147]
We show how to parameterize a distribution over augmentations and optimize the training loss simultaneously with respect to the network parameters and augmentation parameters.
We can recover the correct set and extent of invariances on image classification, regression, segmentation, and molecular property prediction from a large space of augmentations.
arXiv Detail & Related papers (2020-10-22T17:18:48Z)
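The augmentation-learning approach in the last entry can be sketched as follows: make the support of the augmentation distribution a trainable parameter, average the network's predictions over sampled augmentations, and regularize toward broader invariance. The PyTorch sketch below is a toy 2D-rotation version under assumed names (log_w, the point-cloud input shape, and the regularizer weight are all illustrative), not the authors' code; it only demonstrates that gradients reach both the network and the invariance range:

```python
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(  # toy classifier on flattened 2D point clouds
    torch.nn.Linear(2 * 16, 64), torch.nn.ReLU(), torch.nn.Linear(64, 3))
log_w = torch.zeros(1, requires_grad=True)  # learnable rotation range (radians)

def rotate(x, theta):
    """Rotate a batch of point clouds x: (B, 16, 2) by angles theta: (B,)."""
    c, s = torch.cos(theta), torch.sin(theta)
    R = torch.stack([torch.stack([c, -s], -1), torch.stack([s, c], -1)], -2)
    return x @ R.transpose(-1, -2)

def averaged_logits(x, k=8):
    """Monte Carlo average of predictions over theta ~ U(-w, w)."""
    w = torch.exp(log_w)
    out = 0.0
    for _ in range(k):
        theta = w * (2 * torch.rand(x.shape[0]) - 1)  # reparameterized sample
        out = out + net(rotate(x, theta).flatten(1))
    return out / k

x = torch.randn(32, 16, 2)
y = torch.randint(0, 3, (32,))
# Regularizer pushes toward a wider (more invariant) augmentation range
loss = torch.nn.functional.cross_entropy(averaged_logits(x), y) - 0.01 * log_w.sum()
loss.backward()  # gradients flow to the network and to the invariance range
```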
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.