Dimensionless machine learning: Imposing exact units equivariance
- URL: http://arxiv.org/abs/2204.00887v1
- Date: Sat, 2 Apr 2022 15:46:20 GMT
- Title: Dimensionless machine learning: Imposing exact units equivariance
- Authors: Soledad Villar and Weichi Yao and David W. Hogg and Ben Blum-Smith and
Bianca Dumitrascu
- Abstract summary: We provide a two-stage learning procedure for units-equivariant machine learning.
We first construct a dimensionless version of its inputs using classic results from dimensional analysis.
We then perform inference in the dimensionless space.
- Score: 7.9926585627926166
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Units equivariance is the exact symmetry that follows from the requirement
that relationships among measured quantities of physics relevance must obey
self-consistent dimensional scalings. Here, we employ dimensional analysis and
ideas from equivariant machine learning to provide a two-stage learning
procedure for units-equivariant machine learning. For a given learning task, we
first construct a dimensionless version of its inputs using classic results
from dimensional analysis, and then perform inference in the dimensionless
space. Our approach can be used to impose units equivariance across a broad
range of machine learning methods which are equivariant to rotations and other
groups. We discuss the in-sample and out-of-sample prediction accuracy gains
one can obtain in contexts like symbolic regression and emulation, where
symmetry is important. We illustrate our approach with simple numerical
examples involving dynamical systems in physics and ecology.
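To make the two-stage procedure concrete, the sketch below nondimensionalizes a toy pendulum-style input (length, gravitational acceleration, mass, period) by taking the null space of its dimension matrix, i.e. the classic Buckingham Pi construction the abstract alludes to. The specific quantities, the `dimensionless_features` helper, and the use of NumPy/SciPy are illustrative assumptions, not code from the paper.

```python
import numpy as np
from scipy.linalg import null_space

# Dimension matrix D: rows are base units (meter, second, kilogram),
# columns are the input quantities (length L, gravity g, mass m, period T).
D = np.array([
    [1,  1, 0, 0],   # meter exponents of  L, g, m, T
    [0, -2, 0, 1],   # second exponents
    [0,  0, 1, 0],   # kilogram exponents
])

# Each null-space basis vector gives the exponents of a dimensionless monomial
# ("pi group") in the inputs; here the single basis vector is proportional to
# (-1, 1, 0, 2), i.e. a power of g * T^2 / L.
pi_exponents = null_space(D)          # shape (4, 1)

def dimensionless_features(x):
    """Map positive raw measurements x to dimensionless pi-group features."""
    logx = np.log(np.asarray(x, dtype=float))
    return np.exp(logx @ pi_exponents)

# Stage one: nondimensionalize a measurement (L=2.0 m, g=9.8 m/s^2, m=1.5 kg, T=2.8 s).
features = dimensionless_features([2.0, 9.8, 1.5, 2.8])
# Stage two: fit any regressor on `features`; its predictions are then
# invariant to rescaling the units of meters, seconds, and kilograms.
print(features)
```

Any downstream model trained on such features (a symbolic regressor, a rotation-equivariant network, etc.) inherits exact units equivariance, which is the point of performing inference in the dimensionless space.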
Related papers
- Symmetry From Scratch: Group Equivariance as a Supervised Learning Task [1.8570740863168362]
For machine learning datasets with symmetries, the usual way to remain compatible with symmetry breaking has been to relax equivariant architectural constraints.
We introduce symmetry-cloning, a method for inducing equivariance in machine learning models.
arXiv Detail & Related papers (2024-10-05T00:44:09Z) - Morphological Symmetries in Robotics [45.32599550966704]
Morphological symmetries are intrinsic properties of the robot's morphology.
These symmetries extend to the robot's state space and sensor measurements.
For data-driven methods, we demonstrate that morphological symmetries can enhance the sample efficiency and generalization of machine learning models.
In the context of analytical methods, we employ abstract harmonic analysis to decompose the robot's dynamics into a superposition of lower-dimensional, independent dynamics.
arXiv Detail & Related papers (2024-02-23T17:21:21Z) - A Unified Framework to Enforce, Discover, and Promote Symmetry in Machine Learning [5.1105250336911405]
We provide a unifying theoretical and methodological framework for incorporating symmetry into machine learning models.
We show that enforcing and discovering symmetry are linear-algebraic tasks that are dual with respect to the bilinear structure of the Lie derivative.
We propose a novel way to promote symmetry by introducing a class of convex regularization functions based on the Lie derivative and nuclear norm relaxation.
arXiv Detail & Related papers (2023-11-01T01:19:54Z) - In-Context Convergence of Transformers [63.04956160537308]
We study the learning dynamics of a one-layer transformer with softmax attention trained via gradient descent.
For data with imbalanced features, we show that the learning dynamics take a stage-wise convergence process.
arXiv Detail & Related papers (2023-10-08T17:55:33Z) - EqMotion: Equivariant Multi-agent Motion Prediction with Invariant
Interaction Reasoning [83.11657818251447]
We propose EqMotion, an efficient equivariant motion prediction model with invariant interaction reasoning.
We conduct experiments for the proposed model on four distinct scenarios: particle dynamics, molecule dynamics, human skeleton motion prediction and pedestrian trajectory prediction.
Our method achieves state-of-the-art prediction performance on all four tasks, improving by 24.0%, 30.1%, 8.6%, and 9.2%, respectively.
arXiv Detail & Related papers (2023-03-20T05:23:46Z) - Sample Efficient Dynamics Learning for Symmetrical Legged
Robots: Leveraging Physics Invariance and Geometric Symmetries [14.848950116410231]
This paper proposes a novel approach for learning dynamics leveraging the symmetry in the underlying robotic system.
Existing frameworks that represent all data in vector space fail to consider the structured information of the robot.
arXiv Detail & Related papers (2022-10-13T19:57:46Z) - Symmetry Group Equivariant Architectures for Physics [52.784926970374556]
In the domain of machine learning, an awareness of symmetries has driven impressive performance breakthroughs.
We argue that the physics community and the broader machine learning community each have much to learn from the other.
arXiv Detail & Related papers (2022-03-11T18:27:04Z) - Equivariant Deep Dynamical Model for Motion Prediction [0.0]
Deep generative modeling is a powerful approach to dynamical modeling, aimed at discovering the most simplified and compressed underlying description of the data.
Most learning tasks have intrinsic symmetries: input transformations either leave the output unchanged or transform it correspondingly.
We propose an SO(3) equivariant deep dynamical model (EqDDM) for motion prediction that learns a structured representation of the input space in the sense that the embedding varies with symmetry transformations.
arXiv Detail & Related papers (2021-11-02T21:01:43Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Learning Equivariant Energy Based Models with Equivariant Stein
Variational Gradient Descent [80.73580820014242]
We focus on the problem of efficient sampling and learning of probability densities by incorporating symmetries in probabilistic models.
We first introduce the Equivariant Stein Variational Gradient Descent algorithm -- an equivariant sampling method based on Stein's identity for sampling from densities with symmetries.
We propose new ways of improving and scaling up training of energy based models.
arXiv Detail & Related papers (2021-06-15T01:35:17Z) - Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)