Connecting Permutation Equivariant Neural Networks and Partition Diagrams
- URL: http://arxiv.org/abs/2212.08648v3
- Date: Thu, 8 Aug 2024 13:09:51 GMT
- Title: Connecting Permutation Equivariant Neural Networks and Partition Diagrams
- Authors: Edward Pearce-Crump
- Abstract summary: We show that all of the weight matrices that appear in Permutation equivariant neural networks can be obtained from Schur-Weyl duality.
In particular, we adapt Schur-Weyl duality to derive a simple, diagrammatic method for calculating the weight matrices themselves.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Permutation equivariant neural networks are often constructed using tensor powers of $\mathbb{R}^{n}$ as their layer spaces. We show that all of the weight matrices that appear in these neural networks can be obtained from Schur-Weyl duality between the symmetric group and the partition algebra. In particular, we adapt Schur-Weyl duality to derive a simple, diagrammatic method for calculating the weight matrices themselves.
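To make the objects the paper classifies concrete, here is a minimal illustrative sketch (my own code, not taken from the paper) of the simplest case: layers $\mathbb{R}^{n} \to \mathbb{R}^{n}$, where the spanning weight matrices indexed by partition diagrams reduce to the identity and the all-ones matrix. The check at the end simply verifies that such a weight matrix commutes with a permutation matrix.

```python
# Minimal illustrative sketch (not code from the paper): the weight matrix of a
# permutation equivariant linear layer R^n -> R^n is a linear combination of the
# two diagram basis matrices for this case: the identity and the all-ones matrix.
import numpy as np

n = 5
rng = np.random.default_rng(0)

# Diagram basis for the order-1 -> order-1 case (two spanning matrices).
identity = np.eye(n)
all_ones = np.ones((n, n))

# Any equivariant weight matrix in this case is a combination of the two.
lam, mu = rng.normal(size=2)
W = lam * identity + mu * all_ones

# Equivariance check: W commutes with every permutation matrix P,
# so W(Px) = P(Wx) for all inputs x.  Test on a random permutation.
P = np.eye(n)[rng.permutation(n)]
x = rng.normal(size=n)
assert np.allclose(W @ (P @ x), P @ (W @ x))
```

The same pattern holds between higher tensor power spaces, where the spanning set is indexed by more partition diagrams; the two matrices above correspond to the two set partitions of a two-element set.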
Related papers
- Monomial Matrix Group Equivariant Neural Functional Networks [1.797555376258229]
We extend the study of the group action on the network weights by incorporating scaling/sign-flipping symmetries.
We name our new family of NFNs the Monomial Matrix Group Equivariant Neural Functional Networks (Monomial-NFN).
arXiv Detail & Related papers (2024-09-18T04:36:05Z)
- Geometrical aspects of lattice gauge equivariant convolutional neural networks [0.0]
Lattice gauge equivariant convolutional neural networks (L-CNNs) are a framework for convolutional neural networks that can be applied to non-Abelian lattice gauge theories.
arXiv Detail & Related papers (2023-03-20T20:49:08Z)
- Fast computation of permutation equivariant layers with the partition algebra [0.0]
Linear neural network layers that are either equivariant or invariant to permutations of their inputs form core building blocks of modern deep learning architectures.
Examples include the layers of DeepSets, as well as linear layers occurring in attention blocks of transformers and some graph neural networks.
arXiv Detail & Related papers (2023-03-10T21:13:12Z)
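The entry above is about computing such layers quickly. As a hedged sketch of the idea in the simplest case (my own code and naming, not the paper's algorithm), the structured weight matrix $W = \lambda I + \mu J$ from the first sketch above can be applied without ever materialising it, reducing the cost from $O(n^{2})$ to $O(n)$.

```python
# Hedged sketch (my own illustration, not the paper's algorithm): because the
# equivariant weight matrix W = lam*I + mu*J is highly structured, the layer
# can be applied matrix-free, in O(n) rather than O(n^2) time.
import numpy as np

def equivariant_layer(x, lam, mu):
    """Apply the DeepSets-style permutation equivariant layer without forming W."""
    return lam * x + mu * x.sum() * np.ones_like(x)

rng = np.random.default_rng(1)
n, lam, mu = 1000, 0.7, -0.2
x = rng.normal(size=n)

# Agrees with the dense formulation W @ x.
W = lam * np.eye(n) + mu * np.ones((n, n))
assert np.allclose(equivariant_layer(x, lam, mu), W @ x)
```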
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- How Jellyfish Characterise Alternating Group Equivariant Neural Networks [0.0]
We find a basis for the learnable, linear, $A_n$-equivariant layer functions between such tensor power spaces in the standard basis of $\mathbb{R}^{n}$.
We also describe how our approach generalises to the construction of neural networks that are equivariant to local symmetries.
arXiv Detail & Related papers (2023-01-24T17:39:10Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
Currently there is a rather promising new trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes (GPs).
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non commutative convolutional neural networks.
We show that non commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
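A common way to realise such a "solve for the equivariant layers" approach numerically is to stack the commutation constraints coming from a set of group generators and compute a nullspace. The sketch below is my own minimal version under that assumption, not the paper's implementation; it uses $S_3$ acting on $\mathbb{R}^{3}$ and recovers the expected two-dimensional space of equivariant matrices (spanned by the identity and the all-ones matrix).

```python
# Hedged sketch inspired by the "solve numerically for equivariant layers" idea
# (my own minimal version, not the paper's code): stack the linear constraints
# g @ W = W @ g for a set of group generators and read off the equivariant
# subspace from the nullspace of the stacked constraint matrix.
import numpy as np
from itertools import product

def constraint_matrix(generators, n):
    """Matrix of the linear map W -> [g @ W - W @ g for g in generators], W flattened row-major."""
    rows = []
    for g in generators:
        block = np.zeros((n * n, n * n))
        for k, (i, j) in enumerate(product(range(n), range(n))):
            E = np.zeros((n, n))
            E[i, j] = 1.0
            block[:, k] = (g @ E - E @ g).ravel()
        rows.append(block)
    return np.vstack(rows)

n = 3
# Generators of S_3 as permutation matrices: the transposition (1 2) and the 3-cycle.
swap = np.eye(n)[[1, 0, 2]]
cycle = np.eye(n)[[1, 2, 0]]

C = constraint_matrix([swap, cycle], n)
_, s, Vt = np.linalg.svd(C)

# Rows of Vt with (numerically) zero singular values span the equivariant subspace.
basis = [Vt[i].reshape(n, n) for i in range(n * n) if s[i] < 1e-10]
print("dimension of the equivariant space:", len(basis))  # expect 2
```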
- MDP Homomorphic Networks: Group Symmetries in Reinforcement Learning [90.20563679417567]
This paper introduces MDP homomorphic networks for deep reinforcement learning.
MDP homomorphic networks are neural networks that are equivariant under symmetries in the joint state-action space of an MDP.
We show that such networks converge faster than unstructured networks on CartPole, a grid world and Pong.
arXiv Detail & Related papers (2020-06-30T15:38:37Z)
- The general theory of permutation equivarant neural networks and higher order graph variational encoders [6.117371161379209]
We derive formulae for general permutation equivariant layers, including the case where the layer acts on matrices by permuting their rows and columns simultaneously.
This case arises naturally in graph learning and relation learning applications.
We present a second order graph variational encoder, and show that the latent distribution of equivariant generative models must be exchangeable.
arXiv Detail & Related papers (2020-04-08T13:29:56Z)
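As a small, hedged illustration of the simultaneous row-and-column case described in the last entry (my own toy example; the full basis of such layers is larger), the following checks numerically that a few natural linear operations on matrices commute with conjugation by a permutation matrix.

```python
# Hedged illustration (my own toy example, not the paper's complete basis):
# a few linear operations on matrices that are equivariant when rows and
# columns are permuted simultaneously, X -> P X P^T, with a numerical check.
import numpy as np

def second_order_layer(X, a=0.3, b=-0.5, c=0.1, d=0.2):
    """A linear combination of a few second-order equivariant operations."""
    n = X.shape[0]
    ones = np.ones(n)
    return (a * X                                   # identity
            + b * X.T                               # transpose
            + c * np.outer(X.sum(axis=1), ones)     # broadcast row sums
            + d * X.mean() * np.ones((n, n)))       # broadcast the overall mean

rng = np.random.default_rng(2)
n = 6
X = rng.normal(size=(n, n))
P = np.eye(n)[rng.permutation(n)]

# Equivariance: applying the layer commutes with permuting rows and columns.
assert np.allclose(second_order_layer(P @ X @ P.T),
                   P @ second_order_layer(X) @ P.T)
```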
This list is automatically generated from the titles and abstracts of the papers on this site.