Approximately Equivariant Networks for Imperfectly Symmetric Dynamics
- URL: http://arxiv.org/abs/2201.11969v1
- Date: Fri, 28 Jan 2022 07:31:28 GMT
- Title: Approximately Equivariant Networks for Imperfectly Symmetric Dynamics
- Authors: Rui Wang, Robin Walters, Rose Yu
- Abstract summary: We find that our models can outperform both baselines with no symmetry bias and baselines with overly strict symmetry in both simulated turbulence domains and real-world multi-stream jet flow.
- Score: 24.363954435050264
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Incorporating symmetry as an inductive bias into neural network architecture
has led to improvements in generalization, data efficiency, and physical
consistency in dynamics modeling. Methods such as CNNs or equivariant neural
networks use weight tying to enforce symmetries such as shift invariance or
rotational equivariance. However, although physical laws obey many symmetries,
real-world dynamical data rarely conform to strict mathematical symmetry,
either because the data are noisy or incomplete or because the underlying
dynamical system contains symmetry-breaking features. We explore approximately
equivariant networks which are biased towards preserving symmetry but are not
strictly constrained to do so. By relaxing equivariance constraints, we find
that our models can outperform both baselines with no symmetry bias and
baselines with overly strict symmetry in both simulated turbulence domains and
real-world multi-stream jet flow.
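To make the idea concrete, here is a minimal PyTorch sketch of relaxed weight tying in the spirit of the paper: each element of the rotation group C4 mixes a small kernel bank with its own learned coefficients instead of reusing one shared kernel, so exact equivariance is only softly encouraged. The class name and layer sizes are illustrative, not the authors' exact architecture.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelaxedC4Conv(nn.Module):
    """Toy 'relaxed' group convolution over the 4-fold rotation group C4.

    A strictly equivariant C4 lifting layer applies the *same* kernel,
    rotated, for every group element. Here each rotation instead mixes a
    small kernel bank with its own learned coefficients, so exact weight
    tying (and hence exact equivariance) is only softly encouraged.
    """

    def __init__(self, in_ch, out_ch, ksize=3, bank=2):
        super().__init__()
        self.pad = ksize // 2
        self.bank = nn.Parameter(torch.randn(bank, out_ch, in_ch, ksize, ksize) * 0.1)
        # One mixing weight per (group element, bank member); a uniform
        # initialization starts the layer close to strict weight sharing.
        self.mix = nn.Parameter(torch.ones(4, bank) / bank)

    def forward(self, x):  # x: (B, in_ch, H, W)
        outs = []
        for g in range(4):  # the four rotations in C4
            kernel = torch.einsum("l,loihw->oihw", self.mix[g], self.bank)
            kernel = torch.rot90(kernel, k=g, dims=(-2, -1))
            outs.append(F.conv2d(x, kernel, padding=self.pad))
        return torch.stack(outs, dim=1)  # (B, |C4|, out_ch, H, W)
```
A regularizer penalizing the spread of the mixing weights across group elements would recover strict tying in the limit, which is how such a layer can interpolate between no symmetry bias and exact equivariance.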
Related papers
- The Empirical Impact of Neural Parameter Symmetries, or Lack Thereof [50.49582712378289]
We investigate the impact of neural parameter symmetries by introducing new neural network architectures.
We develop two methods, with some provable guarantees, for modifying standard neural networks to reduce parameter-space symmetries.
Our experiments reveal several interesting observations on the empirical impact of parameter symmetries.
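A toy illustration of one way to reduce permutation symmetry (hypothetical, not the paper's exact construction): freeze a random subset of weights at fixed values so hidden units are no longer interchangeable.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FixedMaskLinear(nn.Module):
    """Linear layer with a random subset of weights frozen at fixed values.

    Permuting hidden units normally leaves an MLP's function unchanged;
    pinning distinct constants into the weight matrix breaks that
    interchangeability. A toy illustration only.
    """

    def __init__(self, in_f, out_f, frac_fixed=0.1):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_f, in_f) / in_f ** 0.5)
        self.bias = nn.Parameter(torch.zeros(out_f))
        # Buffers, not parameters: these entries never receive gradients.
        self.register_buffer("mask", torch.rand(out_f, in_f) < frac_fixed)
        self.register_buffer("fixed", torch.randn(out_f, in_f))

    def forward(self, x):
        w = torch.where(self.mask, self.fixed, self.weight)
        return F.linear(x, w, self.bias)
```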
arXiv Detail & Related papers (2024-05-30T16:32:31Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
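For intuition, exact equivariance to a lattice point group can be obtained generically by group averaging; a sketch for the square lattice's D4 group follows. This is not the LENN mechanism itself, which encodes the symmetry in the weights rather than averaging at runtime.
```python
import torch

def d4_average(f, x):
    """Make f exactly equivariant to the square lattice's D4 group by
    averaging g^-1 f(g x) over its 8 rotations/reflections ('twirling').

    f should map 2D fields (..., H, W) to fields of the same shape.
    """
    outs = []
    for k in range(4):
        r = torch.rot90(x, k, dims=(-2, -1))                 # g = rot^k
        outs.append(torch.rot90(f(r), -k, dims=(-2, -1)))
        m = torch.flip(r, dims=(-1,))                        # g = flip . rot^k
        outs.append(torch.rot90(torch.flip(f(m), dims=(-1,)), -k, dims=(-2, -1)))
    return torch.stack(outs).mean(dim=0)
```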
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Symmetry Breaking and Equivariant Neural Networks [17.740760773905986]
We introduce a novel notion of 'relaxed equivariance'.
We show how to incorporate this relaxation into equivariant multilayer perceptrons (E-MLPs).
The relevance of symmetry breaking is then discussed in various application domains.
arXiv Detail & Related papers (2023-12-14T15:06:48Z)
- Learning Layer-wise Equivariances Automatically using Gradients [66.81218780702125]
Convolutions encode equivariance symmetries into neural networks, leading to better generalisation performance.
However, such symmetries provide fixed hard constraints on the functions a network can represent; they must be specified in advance and cannot be adapted.
Our goal is to allow flexible symmetry constraints that can automatically be learned from data using gradients.
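One simple gradient-learnable relaxation, sketched below, is a soft equivariance penalty whose strength is itself a parameter; the paper's actual mechanism is a differentiable model-selection objective, so treat this as a generic alternative.
```python
import torch

def equivariance_penalty(model, x, group_act, strength):
    """Soft equivariance regularizer: strength * || f(g.x) - g.f(x) ||^2.

    `strength` can itself be a learnable (e.g. per-layer) parameter, so
    gradient descent decides how strongly each symmetry is enforced.
    Assumes the model's input and output live on the same grid, so
    `group_act` applies to both.
    """
    residual = model(group_act(x)) - group_act(model(x))
    return strength * residual.pow(2).mean()

# Example group action: a one-pixel circular shift of image batches.
shift = lambda t: torch.roll(t, shifts=1, dims=-1)
```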
arXiv Detail & Related papers (2023-10-09T20:22:43Z)
- On discrete symmetries of robotics systems: A group-theoretic and data-driven analysis [38.92081817503126]
We study discrete morphological symmetries of dynamical systems.
These symmetries arise from the presence of one or more planes/axes of symmetry in the system's morphology.
We exploit these symmetries using data augmentation and $G$-equivariant neural networks.
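As a minimal sketch, assuming matrix representations of the reflection on states and actions are available (the representations here are hypothetical), data augmentation just doubles the dataset with mirrored samples:
```python
import torch

def g_augment(states, actions, R_s, R_a):
    """Double a dataset using one discrete morphological symmetry.

    R_s and R_a are hypothetical matrix representations of the reflection
    acting on state and action vectors; for a sagittally symmetric robot
    they permute left/right joints and flip lateral signs. A G-equivariant
    network would instead bake the same symmetry into its weights.
    """
    return (torch.cat([states, states @ R_s.T]),
            torch.cat([actions, actions @ R_a.T]))

# Toy example: two mirrored joints whose angles swap under the reflection.
R = torch.tensor([[0.0, 1.0], [1.0, 0.0]])
states, actions = torch.randn(8, 2), torch.randn(8, 2)
aug_states, aug_actions = g_augment(states, actions, R, R)
```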
arXiv Detail & Related papers (2023-02-21T04:10:16Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
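Schematically, the training objective asks a candidate transformation to preserve the oracle while staying away from the identity; the sketch below is a simplified single-generator version, not the paper's full loss.
```python
import torch

def symmetry_discovery_loss(T, oracle, x, scale=1.0):
    """Schematic loss for discovering a continuous symmetry of a dataset.

    T is a small network proposing an infinitesimal transformation
    x -> x + T(x). The first term asks the oracle (e.g. the labelling
    function) to be invariant under it; the second keeps the proposed
    transformation at a fixed size so the identity is excluded.
    """
    invariance = (oracle(x + T(x)) - oracle(x)).pow(2).mean()
    nontrivial = (T(x).pow(2).mean() - scale).pow(2)
    return invariance + nontrivial
```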
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- The Surprising Effectiveness of Equivariant Models in Domains with Latent Symmetry [6.716931832076628]
We show that imposing symmetry constraints that do not exactly match the domain symmetry is very helpful in learning the true symmetry in the environment.
We demonstrate that an equivariant model can significantly outperform non-equivariant methods on domains with latent symmetries both in supervised learning and in reinforcement learning for robotic manipulation and control problems.
arXiv Detail & Related papers (2022-11-16T21:51:55Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Exact solutions of interacting dissipative systems via weak symmetries [77.34726150561087]
We analytically diagonalize the Liouvillian of a class of Markovian dissipative systems with arbitrarily strong interactions or nonlinearity.
This enables an exact description of the full dynamics and dissipative spectrum.
Our method is applicable to a variety of other systems, and could provide a powerful new tool for the study of complex driven-dissipative quantum systems.
arXiv Detail & Related papers (2021-09-27T17:45:42Z)
- Encoding Involutory Invariance in Neural Networks [1.6371837018687636]
In certain situations, neural networks (NNs) are trained on data that obey underlying physical symmetries.
In this work, we explore a special kind of symmetry where functions are invariant with respect to involutory linear/affine transformations up to parity.
Numerical experiments indicate that the proposed models outperform baseline networks while respecting the imposed symmetry.
An adaptation of our technique to convolutional NN classification tasks for datasets with inherent horizontal/vertical reflection symmetry is also proposed.
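The underlying constraint is easy to state: for an involutory map A (with A @ A = I) and parity s = ±1, projecting f onto f_sym(x) = (f(x) + s * f(A x)) / 2 guarantees f_sym(A x) = s * f_sym(x) exactly. A minimal sketch of that projection, not the paper's specific architecture:
```python
import torch

def symmetrize(f, A, parity=1.0):
    """Project f onto the part satisfying f_sym(A x) = parity * f_sym(x),
    given an involutory matrix A (A @ A = I) and parity in {+1, -1}."""
    return lambda x: 0.5 * (f(x) + parity * f(x @ A.T))

# Example: invariance under a global sign flip (A = -I, so A @ A = I).
A = -torch.eye(3)
f_sym = symmetrize(torch.nn.Linear(3, 1), A)
x = torch.randn(5, 3)
assert torch.allclose(f_sym(x @ A.T), f_sym(x), atol=1e-6)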
arXiv Detail & Related papers (2021-06-07T16:07:15Z)
- Incorporating Symmetry into Deep Dynamics Models for Improved Generalization [24.363954435050264]
We propose to improve accuracy and generalization by incorporating symmetries into convolutional neural networks.
Our models are theoretically and experimentally robust to distributional shift by symmetry group transformations.
Compared with image or text applications, our work is a significant step towards applying equivariant neural networks to high-dimensional systems.
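A handy generic diagnostic for such claims is the empirical equivariance gap, sketched below for 90-degree rotations; it is near zero for a strictly equivariant model and grows as equivariance is relaxed or violated.
```python
import torch

def equivariance_gap(model, x, k=1):
    """Relative gap || f(rot x) - rot f(x) || / || rot f(x) || under a
    90-degree rotation; ~0 for a strictly rotation-equivariant model.
    Assumes the model maps fields (..., H, W) to fields of the same shape.
    """
    rot = lambda t: torch.rot90(t, k=k, dims=(-2, -1))
    target = rot(model(x))
    return (model(rot(x)) - target).norm() / target.norm()
```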
arXiv Detail & Related papers (2020-02-08T01:28:17Z)