Group invariant machine learning by fundamental domain projections
- URL: http://arxiv.org/abs/2202.02164v1
- Date: Fri, 4 Feb 2022 14:45:57 GMT
- Title: Group invariant machine learning by fundamental domain projections
- Authors: Benjamin Aslan, Daniel Platt, David Sheard
- Abstract summary: We approach the well-studied problem of supervised group invariant and equivariant machine learning from the point of view of geometric topology.
We propose a novel approach using a pre-processing step, which involves projecting the input data into a geometric space that parametrises the orbits of the symmetry group.
This projected data can then be the input for an arbitrary machine learning model.
- Score: 0.5156484100374058
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We approach the well-studied problem of supervised group invariant and
equivariant machine learning from the point of view of geometric topology. We
propose a novel approach using a pre-processing step, which involves projecting
the input data into a geometric space which parametrises the orbits of the
symmetry group. This new data can then be the input for an arbitrary machine
learning model (neural network, random forest, support-vector machine, etc.).
We give an algorithm to compute the geometric projection, which is efficient
to implement, and we illustrate our approach on some example machine learning
problems (including the well-studied problem of predicting Hodge numbers of
CICY matrices), in each case finding an improvement in accuracy versus others
in the literature. The geometric topology viewpoint also allows us to give a
unified description of so-called intrinsic approaches to group equivariant
machine learning, which encompasses many other approaches in the literature.
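To make the pre-processing idea concrete: for a finite permutation group acting on matrices (as with the row and column permutations acting on CICY configuration matrices), projecting to a fundamental domain amounts to replacing each input by a distinguished representative of its orbit. The sketch below is a heuristic illustration of that idea, not the paper's algorithm: it canonicalises a matrix by alternately sorting rows and columns lexicographically until a fixed point is reached.

```python
import numpy as np

def orbit_projection(M, max_iters=100):
    """Heuristic projection of a matrix onto a distinguished representative
    of its orbit under row and column permutations: alternately sort rows
    and columns lexicographically until a fixed point is reached.
    NOTE: an illustrative sketch, not the paper's algorithm; for some
    matrices different orbit points can reach different fixed points."""
    M = np.asarray(M)
    for _ in range(max_iters):
        prev = M.copy()
        M = M[np.lexsort(M.T[::-1])]      # sort rows lexicographically
        M = M[:, np.lexsort(M[::-1])]     # sort columns lexicographically
        if np.array_equal(M, prev):
            break                          # invariant under further sorting
    return M

# Matrices in the same orbit are (usually) mapped to the same point:
A = np.array([[0, 1], [2, 3]])
B = A[[1, 0]][:, [1, 0]]                   # permute rows and columns
assert np.array_equal(orbit_projection(A), orbit_projection(B))
```

Any model that consumes `orbit_projection(M)` instead of `M` is then invariant under these permutations, because the projection is constant on each orbit it collapses.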
Related papers
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries from trained neural networks that approximate the input-output mappings of the tasks (the equivariance condition such methods rest on is sketched after this entry).
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z)
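LieSD's own construction is not reproduced here; as a hedged illustration of the condition such symmetry-discovery methods exploit, the sketch below numerically tests the equivariance f(exp(t*A_in) x) ≈ exp(t*A_out) f(x) of a learned map f for a candidate pair of Lie-algebra generators. The names `f`, `A_in`, and `A_out` are assumptions for the example.

```python
import numpy as np
from scipy.linalg import expm

def equivariance_residual(f, A_in, A_out, x, t=1e-3):
    """Residual of f(exp(t*A_in) @ x) ≈ exp(t*A_out) @ f(x) for a
    candidate generator pair; small values over many x suggest that
    (A_in, A_out) is a symmetry of the learned map f."""
    g_in, g_out = expm(t * A_in), expm(t * A_out)
    return np.linalg.norm(f(g_in @ x) - g_out @ f(x))

# Toy check: the rotation generator of so(2) is a symmetry of the norm,
# acting trivially (zero generator) on the 1-dimensional output.
A_in = np.array([[0.0, -1.0], [1.0, 0.0]])
A_out = np.zeros((1, 1))
f = lambda x: np.array([np.linalg.norm(x)])
x = np.random.default_rng(0).normal(size=2)
print(equivariance_residual(f, A_in, A_out, x))   # ~ 0
```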
- Symmetry From Scratch: Group Equivariance as a Supervised Learning Task [1.8570740863168362]
In machine learning datasets with symmetries, the usual route to backward compatibility with symmetry breaking has been to relax equivariant architectural constraints.
We introduce symmetry-cloning, a method for inducing equivariance in machine learning models.
arXiv Detail & Related papers (2024-10-05T00:44:09Z)
- Disentangled Representation Learning with the Gromov-Monge Gap [65.73194652234848]
Learning disentangled representations from unlabelled data is a fundamental challenge in machine learning.
We introduce a novel approach to disentangled representation learning based on quadratic optimal transport.
We demonstrate the effectiveness of our approach for quantifying disentanglement across four standard benchmarks.
arXiv Detail & Related papers (2024-07-10T16:51:32Z)
- Geometry of EM and related iterative algorithms [8.228889210180268]
The Expectation-Maximization (EM) algorithm is a simple meta-algorithm that has been used for many years as a methodology for statistical inference.
In this paper, we introduce the $em$ algorithm, an information geometric formulation of the EM algorithm, together with its extensions and applications to various problems (a minimal instance of classical EM is sketched after this entry).
arXiv Detail & Related papers (2022-09-03T00:23:23Z)
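For readers who want the baseline in code: below is a minimal, standard EM loop for a two-component 1D Gaussian mixture, i.e. the classical algorithm that the paper's information-geometric $em$ formulation reinterprets; the $em$ formulation itself is not shown.

```python
import numpy as np

def em_gmm_1d(x, n_iters=50):
    """Classical EM for a two-component 1D Gaussian mixture."""
    rng = np.random.default_rng(0)
    mu = rng.choice(x, size=2, replace=False)        # initial means
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iters):
        # E-step: responsibilities (the 1/sqrt(2*pi) factor cancels).
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from weighted sufficient statistics.
        Nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / Nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk)
        pi = Nk / len(x)
    return mu, sigma, pi
```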
- Dimensionless machine learning: Imposing exact units equivariance [7.9926585627926166]
We provide a two-stage learning procedure for units-equivariant machine learning.
We first construct a dimensionless version of the inputs using classic results from dimensional analysis.
We then perform inference in the dimensionless space (a toy instance follows this entry).
arXiv Detail & Related papers (2022-04-02T15:46:20Z)
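A toy instance of the two-stage procedure, under assumed physical inputs (a pendulum's length L in metres, gravity g in m/s^2, and period T in seconds): dimensional analysis supplies the characteristic time sqrt(L/g), inference happens on the dimensionless ratio, and units are restored afterwards. This illustrates the idea only, not the paper's general construction.

```python
import numpy as np

def to_dimensionless(T, L, g):
    """Stage 1: nondimensionalize the target using the characteristic
    time sqrt(L/g) obtained from dimensional analysis."""
    return T / np.sqrt(L / g)

def from_dimensionless(pred, L, g):
    """Restore units after inference in the dimensionless space."""
    return pred * np.sqrt(L / g)

# Any regressor trained on dimensionless quantities is exactly
# units-equivariant; for an ideal pendulum the learned dimensionless
# constant should approach 2*pi.
```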
- Geometric Methods for Sampling, Optimisation, Inference and Adaptive Agents [102.42623636238399]
We identify fundamental geometric structures that underlie the problems of sampling, optimisation, inference and adaptive decision-making.
We derive algorithms that exploit these geometric structures to solve these problems efficiently.
arXiv Detail & Related papers (2022-03-20T16:23:17Z)
- Imitation of Manipulation Skills Using Multiple Geometries [20.21868546298435]
We propose a learning approach to extract the optimal representation from a dictionary of coordinate systems to represent an observed movement.
We apply our approach to grasping and box opening tasks in simulation and on a 7-axis Franka Emika robot.
arXiv Detail & Related papers (2022-03-02T15:19:33Z)
- GENEOnet: A new machine learning paradigm based on Group Equivariant Non-Expansive Operators. An application to protein pocket detection [97.5153823429076]
We introduce a new computational paradigm based on Group Equivariant Non-Expansive Operators.
We test our method, called GENEOnet, on a key problem in drug design: detecting pockets on the surface of proteins that can host ligands.
arXiv Detail & Related papers (2022-01-31T11:14:51Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded in terms of the 'complexity' of the fractal structure that underlies its invariant measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- Parallelized Computation and Backpropagation Under Angle-Parametrized Orthogonal Matrices [0.0]
We show how an apparently sequential elementary rotation parametrization can be restructured into blocks of commutative operations (illustrated after this entry).
We discuss parametric restrictions of interest to generative modeling and present promising performance results with a prototype GPU implementation.
arXiv Detail & Related papers (2021-05-30T00:47:03Z)
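The key observation is that Givens rotations acting on disjoint coordinate pairs commute, so a round-robin schedule groups all d(d-1)/2 rotations into d-1 blocks whose members can be applied in any order, or in parallel. A minimal sketch, assuming even d; the paper's actual blocking and GPU kernels may differ.

```python
import numpy as np

def round_robin_pairings(d):
    """All d*(d-1)/2 index pairs scheduled into d-1 rounds of disjoint
    pairs (classic round-robin tournament schedule; assumes even d)."""
    idx = list(range(d))
    rounds = []
    for _ in range(d - 1):
        rounds.append([(idx[i], idx[d - 1 - i]) for i in range(d // 2)])
        idx.insert(1, idx.pop())          # rotate all but the first entry
    return rounds

def orthogonal_from_angles(angles, d):
    """Compose Givens rotations block by block; rotations within a block
    touch disjoint coordinate pairs, hence commute."""
    Q, k = np.eye(d), 0
    for block in round_robin_pairings(d):
        for i, j in block:                # order within a block is irrelevant
            c, s = np.cos(angles[k]), np.sin(angles[k])
            ri, rj = Q[i].copy(), Q[j].copy()
            Q[i], Q[j] = c * ri - s * rj, s * ri + c * rj
            k += 1
    return Q

d = 4
Q = orthogonal_from_angles(np.random.default_rng(0).uniform(size=d * (d - 1) // 2), d)
assert np.allclose(Q @ Q.T, np.eye(d))    # orthogonal by construction
```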
- Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning (a generic manifold step of this kind is sketched below).
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
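The paper's specific stochastic flows are not given here; as a generic illustration of geometric optimization on O(d), the sketch below takes one descent step that stays exactly on the manifold by projecting the Euclidean gradient to a skew-symmetric direction and retracting with the Cayley transform, a standard device in feasible methods for orthogonality constraints.

```python
import numpy as np

def cayley_step(Q, euclid_grad, lr=0.1):
    """One gradient step on the orthogonal group: project the Euclidean
    gradient to a skew-symmetric direction A, then retract with the
    Cayley transform, which is orthogonal whenever A is skew."""
    G = euclid_grad(Q)
    A = G @ Q.T - Q @ G.T                      # skew-symmetric part
    d = Q.shape[0]
    W = np.linalg.solve(np.eye(d) + (lr / 2) * A,
                        np.eye(d) - (lr / 2) * A)
    return W @ Q                               # iterate stays in O(d)

# Example: steer Q toward an orthogonal target T by minimizing ||Q - T||^2.
rng = np.random.default_rng(0)
T, _ = np.linalg.qr(rng.normal(size=(3, 3)))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) * np.linalg.det(T) < 0:
    Q[:, 0] *= -1                              # match connected components
for _ in range(200):
    Q = cayley_step(Q, lambda Q: 2 * (Q - T))
assert np.allclose(Q @ Q.T, np.eye(3))         # orthogonality preserved
```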
This list is automatically generated from the titles and abstracts of the papers on this site.