MatrixNet: Learning over symmetry groups using learned group representations
- URL: http://arxiv.org/abs/2501.09571v1
- Date: Thu, 16 Jan 2025 14:45:12 GMT
- Title: MatrixNet: Learning over symmetry groups using learned group representations
- Authors: Lucas Laird, Circe Hsu, Asilata Bapat, Robin Walters
- Abstract summary: We propose MatrixNet, a neural network architecture that learns matrix representations of group element inputs instead of using predefined representations.
We show that MatrixNet respects group relations, allowing generalization to group elements of greater word length than those in the training set.
- Score: 13.19415425364914
- Abstract: Group theory has been used in machine learning to provide a theoretically grounded approach for incorporating known symmetry transformations in tasks from robotics to protein modeling. In these applications, equivariant neural networks use known symmetry groups with predefined representations to learn over geometric input data. We propose MatrixNet, a neural network architecture that learns matrix representations of group element inputs instead of using predefined representations. MatrixNet achieves higher sample efficiency and generalization over several standard baselines in prediction tasks over several finite groups and the Artin braid group. We also show that MatrixNet respects group relations, allowing generalization to group elements of greater word length than those in the training set.
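The core idea can be illustrated with a minimal sketch (not the authors' implementation; all names are illustrative): each group generator is assigned a matrix, a group word maps to the ordered product of its generator matrices, and inverses map to matrix inverses, so the representation respects group relations by construction.

```python
import numpy as np

# Minimal sketch of the MatrixNet idea: assign each generator a d x d
# matrix and represent a word by the ordered product of its generator
# matrices. Here the matrices are random; in MatrixNet they are produced
# by a neural network and trained end to end on the downstream task.
rng = np.random.default_rng(0)
d = 4
generators = {"s1": rng.normal(size=(d, d)), "s2": rng.normal(size=(d, d))}
# Formal inverses get the matrix inverse, so a word times its inverse
# collapses to the identity by construction.
generators["s1_inv"] = np.linalg.inv(generators["s1"])

def represent(word):
    """Map a word (list of generator names) to its matrix representation."""
    M = np.eye(d)
    for g in word:
        M = M @ generators[g]
    return M

# s1 followed by s1^-1 maps to (numerically) the identity, illustrating
# how matrix-valued representations respect group relations.
assert np.allclose(represent(["s1", "s1_inv"]), np.eye(d))
```

A downstream head (e.g. an MLP on the flattened matrix) would then predict task labels from this representation; that part is omitted here.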
Related papers
- A Diagrammatic Approach to Improve Computational Efficiency in Group Equivariant Neural Networks [1.9643748953805935]
Group equivariant neural networks are growing in importance owing to their ability to generalise well in applications where the data has known underlying symmetries.
Recent characterisations of a class of these networks that use high-order tensor power spaces as their layers suggest that they have significant potential.
We present a fast matrix multiplication algorithm for any equivariant weight matrix that maps between tensor power layer spaces in these networks for four groups.
arXiv Detail & Related papers (2024-12-14T14:08:06Z)
- Learning Symmetries via Weight-Sharing with Doubly Stochastic Tensors [46.59269589647962]
Group equivariance has emerged as a valuable inductive bias in deep learning.
Group equivariant methods require the groups of interest to be known beforehand.
We show that when the dataset exhibits strong symmetries, the permutation matrices will converge to regular group representations.
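The learnable "soft permutations" referenced above are typically obtained by Sinkhorn normalization; a generic sketch (not the paper's code) is:

```python
import numpy as np

# Sinkhorn normalization: alternately normalizing the rows and columns
# of a positive matrix converges to a doubly stochastic matrix, i.e. a
# relaxed, learnable stand-in for a permutation matrix.
def sinkhorn(logits, n_iters=50):
    P = np.exp(logits)
    for _ in range(n_iters):
        P = P / P.sum(axis=1, keepdims=True)  # normalize rows
        P = P / P.sum(axis=0, keepdims=True)  # normalize columns
    return P

rng = np.random.default_rng(0)
P = sinkhorn(rng.normal(size=(5, 5)))
# Row and column sums are (approximately) 1.
assert np.allclose(P.sum(axis=1), 1.0, atol=1e-6)
assert np.allclose(P.sum(axis=0), 1.0, atol=1e-6)
```

With suitably sharp logits, such a matrix approaches a hard permutation, which is how learned weight-sharing can converge toward regular group representations.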
arXiv Detail & Related papers (2024-12-05T20:15:34Z)
- Metric Learning for Clifford Group Equivariant Neural Networks [15.551447911164903]
Clifford Group Equivariant Neural Networks (CGENNs) leverage Clifford algebras and multivectors to ensure symmetry constraints in neural representations.
Previous works have restricted internal network representations to Euclidean or Minkowski (pseudo-)metrics, handpicked depending on the problem at hand.
We propose an alternative method that enables the metric to be learned in a data-driven fashion, allowing the CGENN network to learn more flexible representations.
arXiv Detail & Related papers (2024-07-13T15:41:14Z)
- Learning Linear Groups in Neural Networks [9.667333420680448]
We present a neural network architecture, Linear Group Networks (LGNs), for learning linear groups acting on the weight space of neural networks.
LGNs learn groups without any supervision or knowledge of the hidden symmetries in the data.
We demonstrate that the linear group structure depends on both the data distribution and the considered task.
arXiv Detail & Related papers (2023-05-29T18:29:11Z)
- Geometric Clifford Algebra Networks [53.456211342585824]
We propose Geometric Clifford Algebra Networks (GCANs) for modeling dynamical systems.
GCANs are based on symmetry group transformations using geometric (Clifford) algebras.
arXiv Detail & Related papers (2023-02-13T18:48:33Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
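As a simpler illustration of the constraint such algorithms solve (the paper itself works via the nullspace of the equivariance equations, not the averaging trick shown here): for a finite matrix group, symmetrizing any linear map over the group yields an equivariant one.

```python
import numpy as np

# Group averaging: W_eq = (1/|G|) * sum_g rho(g) W rho(g)^{-1} commutes
# with every rho(g), i.e. it is an equivariant linear layer.
def perm_matrix(p):
    n = len(p)
    M = np.zeros((n, n))
    M[np.arange(n), p] = 1.0
    return M

# Cyclic group C4 acting on R^4 by cyclic shifts.
shift = perm_matrix([1, 2, 3, 0])
group = [np.linalg.matrix_power(shift, k) for k in range(4)]

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))
# For permutation matrices, g^{-1} = g.T.
W_eq = sum(g @ W @ g.T for g in group) / len(group)

# Equivariance check: rho(g) W_eq == W_eq rho(g) for every element.
for g in group:
    assert np.allclose(g @ W_eq, W_eq @ g)
```

For C4 acting by shifts, the averaged map comes out circulant, the familiar convolution structure that equivariance forces.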
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
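A generic sketch of what optimization on $O(d)$ involves (a standard Riemannian gradient step with QR retraction, not the paper's specific flows): move along a tangent direction at the current orthogonal matrix, then retract back onto the manifold.

```python
import numpy as np

def retract_qr(Y):
    """Retract an arbitrary matrix onto O(d) via the QR decomposition."""
    Q, R = np.linalg.qr(Y)
    # Fix column signs so the retraction is uniquely defined.
    return Q * np.sign(np.diag(R))

def step_on_orthogonal(X, euclid_grad, lr=0.1):
    # G - X G^T X is a tangent direction at X (its X-transpose is
    # skew-symmetric); the constant factor is absorbed into lr.
    rgrad = euclid_grad - X @ euclid_grad.T @ X
    return retract_qr(X - lr * rgrad)

rng = np.random.default_rng(0)
X = retract_qr(rng.normal(size=(3, 3)))
X_new = step_on_orthogonal(X, rng.normal(size=(3, 3)))
# The iterate stays exactly on the orthogonal group.
assert np.allclose(X_new.T @ X_new, np.eye(3), atol=1e-8)
```

Schemes of this kind are what make it possible to keep recurrent or flow parameters orthogonal throughout training.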
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.