Symmetry-via-Duality: Invariant Neural Network Densities from
Parameter-Space Correlators
- URL: http://arxiv.org/abs/2106.00694v1
- Date: Tue, 1 Jun 2021 18:00:06 GMT
- Title: Symmetry-via-Duality: Invariant Neural Network Densities from
Parameter-Space Correlators
- Authors: Anindita Maiti, Keegan Stoner, James Halverson
- Abstract summary: Symmetries of network densities may be determined via dual computations of network correlation functions.
We demonstrate that the amount of symmetry in the initial density affects the accuracy of networks trained on Fashion-MNIST.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Parameter-space and function-space provide two different duality frames in
which to study neural networks. We demonstrate that symmetries of network
densities may be determined via dual computations of network correlation
functions, even when the density is unknown and the network is not equivariant.
Symmetry-via-duality relies on invariance properties of the correlation
functions, which stem from the choice of network parameter distributions. Input
and output symmetries of neural network densities are determined, which recover
known Gaussian process results in the infinite width limit. The mechanism may
also be utilized to determine symmetries during training, when parameters are
correlated, as well as symmetries of the Neural Tangent Kernel. We demonstrate
that the amount of symmetry in the initialization density affects the accuracy
of networks trained on Fashion-MNIST, and that symmetry breaking helps only
when it is in the direction of ground truth.
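The core mechanism admits a compact numerical illustration. Below is a minimal sketch (our own construction, not code from the paper): the parameters of a width-N tanh network are drawn i.i.d. standard normal, a rotation-invariant choice, so the two-point correlator computed in parameter space should be unchanged under an O(d) rotation of the inputs, signalling a symmetry of the network density that is never written down explicitly.

```python
import numpy as np

rng = np.random.default_rng(0)
d, N, draws = 3, 64, 20_000          # input dim, width, Monte Carlo samples

def two_point(x1, x2):
    """Monte Carlo estimate of G(x1, x2) = E_theta[f(x1) f(x2)]."""
    W = rng.standard_normal((draws, N, d))   # input weights, one net per draw
    v = rng.standard_normal((draws, N))      # output weights
    f1 = np.einsum('kn,kn->k', v, np.tanh(W @ x1)) / np.sqrt(N)
    f2 = np.einsum('kn,kn->k', v, np.tanh(W @ x2)) / np.sqrt(N)
    return np.mean(f1 * f2)

x1 = np.array([1.0, 0.5, -0.3])
x2 = np.array([-0.2, 0.8, 0.1])
R, _ = np.linalg.qr(rng.standard_normal((d, d)))   # random orthogonal matrix

print("G(x1, x2)   =", two_point(x1, x2))
print("G(Rx1, Rx2) =", two_point(R @ x1, R @ x2))  # equal up to MC error
```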
Related papers
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries from trained neural networks that approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
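As a toy version of the underlying test (a simplification of our own, not the LieSD method itself): a candidate Lie-algebra generator L generates a symmetry of a trained map f exactly when the Lie derivative of f along L vanishes, which can be checked numerically on sampled inputs.

```python
import numpy as np

# Score a candidate generator L by the size of the Lie derivative
# d/d(eps) f((I + eps L) x), estimated by a symmetric finite difference.
rng = np.random.default_rng(1)

def f(x):                       # stand-in for a trained network
    return x[0]**2 + x[1]**2    # invariant under SO(2) rotations

def lie_score(L, n_pts=1000, eps=1e-5):
    total = 0.0
    for _ in range(n_pts):
        x = rng.standard_normal(2)
        total += abs(f(x + eps * (L @ x)) - f(x - eps * (L @ x))) / (2 * eps)
    return total / n_pts

rot = np.array([[0.0, -1.0],
                [1.0,  0.0]])   # so(2) rotation generator
dil = np.eye(2)                 # dilation generator, not a symmetry of f

print("rotation generator score:", lie_score(rot))   # approximately 0
print("dilation generator score:", lie_score(dil))   # order 1
```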
arXiv Detail & Related papers (2024-10-13T13:39:39Z)
- The Empirical Impact of Neural Parameter Symmetries, or Lack Thereof [50.49582712378289]
We investigate the impact of neural parameter symmetries by introducing new neural network architectures.
We develop two methods, with some provable guarantees, for modifying standard neural networks to reduce parameter-space symmetries.
Our experiments reveal several interesting observations on the empirical impact of parameter symmetries.
arXiv Detail & Related papers (2024-05-30T16:32:31Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Lie Point Symmetry and Physics Informed Networks [59.56218517113066]
We propose a loss function that informs the network about Lie point symmetries in the same way that PINN models try to enforce the underlying PDE through a loss function.
Our symmetry loss ensures that the infinitesimal generators of the Lie group conserve the PDE solutions.
Empirical evaluations indicate that the inductive bias introduced by the Lie point symmetries of the PDEs greatly boosts the sample efficiency of PINNs.
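A heavily simplified sketch of the idea follows (our assumptions: the heat equation u_t = u_xx, whose Lie point symmetries include space translation, and finite differences in place of automatic differentiation; the paper's exact loss may differ). The total loss combines the usual PINN residual with a term asking the translated field to satisfy the PDE as well.

```python
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.standard_normal((16, 2)), rng.standard_normal(16)
w2 = rng.standard_normal(16)

def u(x, t):                                 # tiny stand-in "PINN"
    return w2 @ np.tanh(W1 @ np.array([x, t]) + b1)

def residual(x, t, h=1e-3):                  # R[u] = u_t - u_xx
    u_t = (u(x, t + h) - u(x, t - h)) / (2 * h)
    u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / h**2
    return u_t - u_xx

def losses(pts, eps=1e-2):
    pde = np.mean([residual(x, t)**2 for x, t in pts])
    # to first order in eps this penalizes the x-derivative of the residual,
    # i.e. asks the translation generator to map solutions to solutions
    sym = np.mean([((residual(x + eps, t) - residual(x, t)) / eps)**2
                   for x, t in pts])
    return pde, sym

pts = rng.uniform(-1.0, 1.0, size=(32, 2))
print("PDE loss, symmetry loss:", losses(pts))
```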
arXiv Detail & Related papers (2023-11-07T19:07:16Z)
- Learning Layer-wise Equivariances Automatically using Gradients [66.81218780702125]
Convolutions encode equivariance symmetries into neural networks, leading to better generalisation performance.
However, such symmetries provide fixed hard constraints on the functions a network can represent; they need to be specified in advance and cannot be adapted.
Our goal is to allow flexible symmetry constraints that can automatically be learned from data using gradients.
arXiv Detail & Related papers (2023-10-09T20:22:43Z)
- Hidden symmetries of ReLU networks [17.332539115959708]
In some networks, the only symmetries are permutation of neurons in a layer and positive scaling of parameters at a neuron, while other networks admit additional hidden symmetries.
In this work, we prove that, for any network architecture where no layer is narrower than the input, there exist parameter settings with no hidden symmetries.
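The two non-hidden symmetries in question are easy to exhibit concretely; a minimal sketch of our own: ReLU is positively homogeneous, relu(c*z) = c*relu(z) for c > 0, so both permuting hidden neurons and positively rescaling a neuron's incoming weights while inversely rescaling its outgoing ones leave the network function unchanged.

```python
import numpy as np

rng = np.random.default_rng(3)
relu = lambda z: np.maximum(z, 0.0)

W1, b1 = rng.standard_normal((5, 3)), rng.standard_normal(5)
W2 = rng.standard_normal((2, 5))

def net(W1, b1, W2, x):
    return W2 @ relu(W1 @ x + b1)

x = rng.standard_normal(3)
base = net(W1, b1, W2, x)

perm = rng.permutation(5)                    # permutation symmetry
print(np.allclose(base, net(W1[perm], b1[perm], W2[:, perm], x)))   # True

c = rng.uniform(0.5, 2.0, size=5)            # positive scaling symmetry
print(np.allclose(base, net(c[:, None] * W1, c * b1, W2 / c, x)))   # True
```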
arXiv Detail & Related papers (2023-06-09T18:07:06Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
Currently there exists a rather promising new trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit for CNNs equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes (GPs).
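The scalar, non-equivariant core of this NN-GP correspondence can be checked numerically; a minimal sketch of our own, using the Cho-Saul arc-cosine kernel for a random ReLU layer (the paper's matrix-valued equivariant kernels are not shown):

```python
import numpy as np

rng = np.random.default_rng(4)
d, N, draws = 3, 512, 2000
relu = lambda z: np.maximum(z, 0.0)

x, y = rng.standard_normal(d), rng.standard_normal(d)

# analytic kernel: E_w[relu(w.x) relu(w.y)] for w ~ N(0, I_d)
cos_t = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
theta = np.arccos(np.clip(cos_t, -1.0, 1.0))
K = (np.linalg.norm(x) * np.linalg.norm(y)
     * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi))

# empirical covariance over random networks f(x) = v . relu(W x) / sqrt(N)
W = rng.standard_normal((draws, N, d))
v = rng.standard_normal((draws, N))
fx = np.einsum('kn,kn->k', v, relu(W @ x)) / np.sqrt(N)
fy = np.einsum('kn,kn->k', v, relu(W @ y)) / np.sqrt(N)

print("arc-cosine kernel:    ", K)
print("empirical E[f(x)f(y)]:", np.mean(fx * fy))
```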
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Approximately Equivariant Networks for Imperfectly Symmetric Dynamics [24.363954435050264]
We find that our models can outperform both baselines with no symmetry bias and baselines with overly strict symmetry in both simulated turbulence domains and real-world multi-stream jet flow.
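One simple way to bias a model toward a symmetry without enforcing it exactly (an assumption on our part, not necessarily the paper's construction) is a soft equivariance penalty whose weight interpolates between an unconstrained model and a strictly equivariant one:

```python
import numpy as np

rng = np.random.default_rng(5)
W1, b1 = rng.standard_normal((16, 2)), rng.standard_normal(16)
W2 = rng.standard_normal((2, 16))

def f(x):                                   # unconstrained vector-valued MLP
    return W2 @ np.tanh(W1 @ x + b1)

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])                 # 90-degree rotation

def equiv_penalty(xs):
    # E_x || f(Rx) - R f(x) ||^2, zero iff f is exactly equivariant on xs
    return np.mean([np.sum((f(R @ x) - R @ f(x))**2) for x in xs])

def total_loss(task_loss, xs, lam=0.1):
    # lam = 0: no symmetry bias; lam large: approaches strict equivariance
    return task_loss + lam * equiv_penalty(xs)

xs = rng.standard_normal((64, 2))
print("equivariance penalty of a random net:", equiv_penalty(xs))
```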
arXiv Detail & Related papers (2022-01-28T07:31:28Z)
- Encoding Involutory Invariance in Neural Networks [1.6371837018687636]
In certain situations, Neural Networks (NNs) are trained on data that obey underlying physical symmetries.
In this work, we explore a special kind of symmetry where functions are invariant with respect to involutory linear/affine transformations up to parity.
Numerical experiments indicate that the proposed models outperform baseline networks while respecting the imposed symmetry.
An adaptation of our technique to convolutional NN classification tasks for datasets with inherent horizontal/vertical reflection symmetry has also been proposed.
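A classic construction achieves exactly this kind of invariance up to parity by symmetrizing an arbitrary backbone over the involution; a minimal sketch (the paper's architecture may differ, this only shows the mechanism):

```python
import numpy as np

rng = np.random.default_rng(6)
W1, b1 = rng.standard_normal((16, 3)), rng.standard_normal(16)
w2 = rng.standard_normal(16)

def g(x):                                    # arbitrary backbone network
    return w2 @ np.tanh(W1 @ x + b1)

T = np.diag([1.0, -1.0, 1.0])                # a reflection: T @ T = I

f_even = lambda x: 0.5 * (g(x) + g(T @ x))   # parity +1 by construction
f_odd = lambda x: 0.5 * (g(x) - g(T @ x))    # parity -1 by construction

x = rng.standard_normal(3)
print(np.isclose(f_even(T @ x), f_even(x)))     # True
print(np.isclose(f_odd(T @ x), -f_odd(x)))      # True
```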
arXiv Detail & Related papers (2021-06-07T16:07:15Z)
- Detecting Symmetries with Neural Networks [0.0]
We make extensive use of the structure in the embedding layer of the neural network.
We identify whether a symmetry is present and identify orbits of the symmetry in the input.
For one of our examples we present a novel data representation in terms of graphs.
arXiv Detail & Related papers (2020-03-30T17:58:24Z)