Learning functions on symmetric matrices and point clouds via lightweight invariant features
- URL: http://arxiv.org/abs/2405.08097v2
- Date: Wed, 15 May 2024 13:48:54 GMT
- Title: Learning functions on symmetric matrices and point clouds via lightweight invariant features
- Authors: Ben Blum-Smith, Ningyuan Huang, Marco Cuturi, Soledad Villar
- Abstract summary: We present a formulation for machine learning of functions on symmetric matrices that are invariant with respect to the action of permutations by conjugation.
We show that these invariant features can separate all distinct orbits of symmetric matrices except for a measure zero set.
For point clouds in a fixed dimension, we prove that the number of invariant features can be reduced, generically without losing expressivity.
- Score: 26.619014249559942
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we present a mathematical formulation for machine learning of (1) functions on symmetric matrices that are invariant with respect to the action of permutations by conjugation, and (2) functions on point clouds that are invariant with respect to rotations, reflections, and permutations of the points. To achieve this, we construct $O(n^2)$ invariant features derived from generators for the field of rational functions on $n\times n$ symmetric matrices that are invariant under joint permutations of rows and columns. We show that these invariant features can separate all distinct orbits of symmetric matrices except for a measure zero set; such features can be used to universally approximate invariant functions on almost all weighted graphs. For point clouds in a fixed dimension, we prove that the number of invariant features can be reduced, generically without losing expressivity, to $O(n)$, where $n$ is the number of points. We combine these invariant features with DeepSets to learn functions on symmetric matrices and point clouds with varying sizes. We empirically demonstrate the feasibility of our approach on molecule property regression and point cloud distance prediction.
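The abstract describes building permutation-conjugation invariant features of symmetric matrices and feeding them to a DeepSets-style model. As a minimal sketch of the idea, the snippet below computes a simple family of invariants (power-sum traces $\mathrm{tr}(A^k)$ and the sorted multiset of row sums); these are illustrative invariants only, not the field generators constructed in the paper, and the function name is hypothetical.

```python
import numpy as np

def invariant_features(A, max_power=3):
    """Illustrative invariants of a symmetric matrix under A -> P A P^T.

    tr(A^k) is unchanged under conjugation by any permutation matrix P,
    and sorting the row sums removes their dependence on row order.
    These are NOT the paper's exact generators, just a simple example.
    """
    feats = []
    Ak = A.copy()
    for _ in range(max_power):
        feats.append(np.trace(Ak))
        Ak = Ak @ A
    feats.extend(np.sort(A.sum(axis=1)))  # multiset of row sums, sorted
    return np.array(feats)

# sanity check: features agree before and after a random conjugation
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
A = (A + A.T) / 2                      # symmetrize
P = np.eye(5)[rng.permutation(5)]      # random permutation matrix
assert np.allclose(invariant_features(A), invariant_features(P @ A @ P.T))
```

A fixed-length invariant vector like this can then be pooled across examples of varying size by a DeepSets-style sum-then-MLP architecture, which is the combination the abstract describes.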
Related papers
- Uniform $\mathcal{C}^k$ Approximation of $G$-Invariant and Antisymmetric Functions, Embedding Dimensions, and Polynomial Representations [0.0]
We show that the embedding dimension required is independent of the regularity of the target function, the accuracy of the desired approximation, and $k$.
We also provide upper and lower bounds on $K$ and show that $K$ is independent of the regularity of the target function, the desired approximation accuracy, and $k$.
arXiv Detail & Related papers (2024-03-02T23:19:10Z) - Equivariant Manifold Neural ODEs and Differential Invariants [1.6073704837297416]
We develop a manifestly geometric framework for equivariant manifold neural ordinary differential equations (NODEs).
We use it to analyse their modelling capabilities for symmetric data.
arXiv Detail & Related papers (2024-01-25T12:23:22Z) - Geometry of Linear Neural Networks: Equivariance and Invariance under Permutation Groups [0.0]
We investigate the subvariety of functions that are equivariant or invariant under the action of a permutation group.
We draw conclusions for the parameterization and the design of equivariant and invariant linear networks.
arXiv Detail & Related papers (2023-09-24T19:40:15Z) - Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Machine learning and invariant theory [10.178220223515956]
We introduce the topic and explain a couple of methods to explicitly parameterize equivariant functions.
We explicate a general procedure, attributed to Malgrange, to express all maps between linear spaces that are equivariant under the action of a group $G$.
arXiv Detail & Related papers (2022-09-29T17:52:17Z) - Towards Antisymmetric Neural Ansatz Separation [48.80300074254758]
We study separations between two fundamental models of antisymmetric functions, that is, functions $f$ satisfying $f(x_{\sigma(1)}, \ldots, x_{\sigma(N)}) = \mathrm{sign}(\sigma)\, f(x_1, \ldots, x_N)$ for every permutation $\sigma$.
These arise in the context of quantum chemistry, and are the basic modeling tool for wavefunctions of Fermionic systems.
arXiv Detail & Related papers (2022-08-05T16:35:24Z) - Permutation symmetry in large N Matrix Quantum Mechanics and Partition Algebras [0.0]
We describe the implications of permutation symmetry for the state space and dynamics of quantum mechanical systems of general size $N$.
A symmetry-based mechanism for quantum many body scars discussed in the literature can be realised in these matrix systems with permutation symmetry.
arXiv Detail & Related papers (2022-07-05T16:47:10Z) - $O(N^2)$ Universal Antisymmetry in Fermionic Neural Networks [107.86545461433616]
We propose permutation-equivariant architectures, on which a Slater determinant is applied to induce antisymmetry.
FermiNet is proved to have universal approximation capability with a single determinant, namely, a single determinant suffices to represent any antisymmetric function.
We substitute the Slater determinant with a pairwise antisymmetry construction, which is easy to implement and can reduce the computational cost to $O(N^2)$.
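The pairwise construction summarized above can be illustrated with a toy one-dimensional example: multiply a permutation-invariant factor by a product of odd functions of pairwise differences, which flips sign under any transposition (the Vandermonde-product argument). This is an assumed toy version for illustration, not the paper's actual model; the function name and choice of $\phi$ are hypothetical.

```python
import numpy as np

def pairwise_antisymmetric(x, phi=np.tanh):
    """Toy pairwise antisymmetry for scalar inputs.

    Returns sym(x) * prod_{i<j} phi(x[j] - x[i]) with phi odd.
    Swapping two inputs permutes the pair differences up to an odd
    number of sign flips (as for the Vandermonde product), so the
    value negates. Cost is O(N^2) pairs, versus O(N^3) for evaluating
    a Slater determinant.
    """
    n = len(x)
    sym = np.sum(x ** 2)  # any symmetric function works here
    anti = 1.0
    for i in range(n):
        for j in range(i + 1, n):
            anti *= phi(x[j] - x[i])
    return sym * anti

x = np.array([0.3, -1.2, 0.7])
swapped = x[[1, 0, 2]]  # transpose the first two inputs
assert np.isclose(pairwise_antisymmetric(x), -pairwise_antisymmetric(swapped))
```

Since transpositions generate all permutations, sign-flipping under every swap is enough to make the function fully antisymmetric.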
arXiv Detail & Related papers (2022-05-26T07:44:54Z) - A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
arXiv Detail & Related papers (2021-04-19T17:21:54Z) - Invariant Feature Coding using Tensor Product Representation [75.62232699377877]
We prove that the group-invariant feature vector contains sufficient discriminative information when learning a linear classifier.
A novel feature model that explicitly considers group actions is proposed for principal component analysis and k-means clustering.
arXiv Detail & Related papers (2019-06-05T07:15:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.