Permutation Equivariant Neural Networks for Symmetric Tensors
- URL: http://arxiv.org/abs/2503.11276v1
- Date: Fri, 14 Mar 2025 10:33:13 GMT
- Title: Permutation Equivariant Neural Networks for Symmetric Tensors
- Authors: Edward Pearce-Crump
- Abstract summary: We present two different characterisations of all linear permutation equivariant functions between symmetric power spaces of $\mathbb{R}^n$. We show that these functions are highly data efficient compared to standard MLPs and have potential to generalise well to symmetric tensors of different sizes.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Incorporating permutation equivariance into neural networks has proven to be useful in ensuring that models respect symmetries that exist in data. Symmetric tensors, which naturally appear in statistics, machine learning, and graph theory, are essential for many applications in physics, chemistry, and materials science, amongst others. However, existing research on permutation equivariant models has not explored symmetric tensors as inputs, and most prior work on learning from these tensors has focused on equivariance to Euclidean groups. In this paper, we present two different characterisations of all linear permutation equivariant functions between symmetric power spaces of $\mathbb{R}^n$. We show on two tasks that these functions are highly data efficient compared to standard MLPs and have potential to generalise well to symmetric tensors of different sizes.
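To illustrate what permutation equivariance means for maps between symmetric power spaces, the NumPy sketch below builds a linear map on symmetric matrices (the degree-two case) from a few standard equivariant terms and checks that it commutes with the $S_n$ action $X \mapsto P X P^{\top}$. The terms used are common examples only; they are not the full characterisation derived in the paper.

```python
import numpy as np

# A linear map on symmetric matrices built from a few standard
# permutation equivariant terms (illustrative only; the paper derives
# the complete basis of such maps for arbitrary symmetric powers).
def equivariant_map(X, a=1.0, b=0.5, c=-0.2, d=0.1):
    n = X.shape[0]
    return (a * X                                   # the input itself
            + b * np.diag(np.diag(X))               # keep only the diagonal
            + c * np.trace(X) * np.eye(n)           # trace spread over the diagonal
            + d * X.sum() * np.ones((n, n)))        # total sum spread over all entries

rng = np.random.default_rng(0)
n = 5
X = rng.normal(size=(n, n))
X = (X + X.T) / 2                                   # a symmetric 2-tensor
P = np.eye(n)[rng.permutation(n)]                   # S_n acts by X -> P X P^T

lhs = equivariant_map(P @ X @ P.T)                  # permute first, then map
rhs = P @ equivariant_map(X) @ P.T                  # map first, then permute
print(np.allclose(lhs, rhs))                        # True: the map is equivariant
```

The map uses four free parameters regardless of $n$; weight sharing of this kind is what underlies the data efficiency claimed in the abstract.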
Related papers
- Improving Equivariant Networks with Probabilistic Symmetry Breaking [9.164167226137664]
Equivariant networks encode known symmetries into neural networks, often enhancing generalization.
However, such networks cannot break the symmetries of their inputs, which poses an important problem, both (1) for prediction tasks on domains where self-symmetries are common, and (2) for generative models, which must break symmetries in order to reconstruct from highly symmetric latent spaces.
We present novel theoretical results that establish sufficient conditions for representing such distributions.
arXiv Detail & Related papers (2025-03-27T21:04:49Z) - Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z) - A Galois theorem for machine learning: Functions on symmetric matrices and point clouds via lightweight invariant features [26.619014249559942]
We present a mathematical formulation for machine learning of functions on symmetric matrices and point clouds.
We provide a general construction of generically separating invariant features using ideas inspired by Galois theory.
We prove that the number of invariant features can be reduced, generically without losing expressivity, to $O(n)$, where $n$ is the number of points.
arXiv Detail & Related papers (2024-05-13T18:24:03Z) - Symmetry Breaking and Equivariant Neural Networks [17.740760773905986]
We introduce a novel notion of 'relaxed equivariance'.
We show how to incorporate this relaxation into equivariant multilayer perceptrons (E-MLPs).
The relevance of symmetry breaking is then discussed in various application domains.
arXiv Detail & Related papers (2023-12-14T15:06:48Z) - Approximately Equivariant Graph Networks [14.312312714312046]
Graph neural networks (GNNs) are commonly described as being permutation equivariant with respect to node relabeling in the graph.
We focus on the active symmetries of GNNs, by considering a learning setting where signals are supported on a fixed graph.
arXiv Detail & Related papers (2023-08-21T03:13:38Z) - Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z) - Generative Adversarial Symmetry Discovery [19.098785309131458]
LieGAN represents symmetry as an interpretable Lie algebra basis and can discover various symmetries.
The learned symmetry can also be readily used in several existing equivariant neural networks to improve accuracy and generalization in prediction.
arXiv Detail & Related papers (2023-02-01T04:28:36Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Connecting Permutation Equivariant Neural Networks and Partition Diagrams [0.0]
We show that all of the weight matrices that appear in permutation equivariant neural networks can be obtained from Schur-Weyl duality.
In particular, we adapt Schur-Weyl duality to derive a simple, diagrammatic method for calculating the weight matrices themselves (a minimal illustration of the simplest case appears after the related papers list).
arXiv Detail & Related papers (2022-12-16T18:48:54Z) - Unifying O(3) Equivariant Neural Networks Design with Tensor-Network Formalism [12.008737454250463]
We propose using fusion diagrams, a technique widely employed in simulating SU($2$)-symmetric quantum many-body problems, to design new equivariant components for equivariant neural networks.
When applied to particles within a given local neighborhood, the resulting components, which we term "fusion blocks," serve as universal approximators of any continuous equivariant function.
Our approach, which combines tensor networks with equivariant neural networks, suggests a potentially fruitful direction for designing more expressive equivariant neural networks.
arXiv Detail & Related papers (2022-11-14T16:06:59Z) - Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
Currently there exists a rather promising new trend in machine learning (ML) based on the relationship between neural networks (NN) and Gaussian processes (GP).
In this work we establish a relationship between the many-channel limit for CNNs equivariant with respect to the two-dimensional Euclidean group with vector-valued neuron activations and the corresponding independently introduced equivariant Gaussian processes (GP).
arXiv Detail & Related papers (2022-09-17T17:02:35Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Learning Equivariant Energy Based Models with Equivariant Stein Variational Gradient Descent [80.73580820014242]
We focus on the problem of efficient sampling and learning of probability densities by incorporating symmetries in probabilistic models.
We first introduce the Equivariant Stein Variational Gradient Descent algorithm -- an equivariant sampling method based on Stein's identity for sampling from densities with symmetries.
We propose new ways of improving and scaling up training of energy-based models.
arXiv Detail & Related papers (2021-06-15T01:35:17Z)
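As a companion to the partition-diagram entry above, here is a minimal NumPy sketch of the simplest case those diagrammatic weight matrices cover: every $S_n$-equivariant linear map from $\mathbb{R}^n$ to itself is a combination of just two basis matrices, the identity and the all-ones matrix (the classical Deep Sets layer). The sketch is illustrative only and does not reproduce the diagrammatic construction from that paper.

```python
import numpy as np

n = 6
I = np.eye(n)                      # "identity" diagram: each index maps to itself
J = np.ones((n, n))                # "all-ones" diagram: every index maps to every index
a, b = 0.7, -0.3                   # the only free parameters, independent of n

W = a * I + b * J                  # general form of an S_n-equivariant weight matrix on R^n

rng = np.random.default_rng(1)
x = rng.normal(size=n)
P = np.eye(n)[rng.permutation(n)]  # a random permutation matrix
print(np.allclose(W @ (P @ x), P @ (W @ x)))   # True: W commutes with every permutation
```

Higher-order inputs such as matrices or symmetric tensors enlarge this basis, and partition diagrams provide the bookkeeping for enumerating it.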