Symmetry-Aware Autoencoders: s-PCA and s-nlPCA
- URL: http://arxiv.org/abs/2111.02893v1
- Date: Thu, 4 Nov 2021 14:22:19 GMT
- Title: Symmetry-Aware Autoencoders: s-PCA and s-nlPCA
- Authors: Simon Kneer, Taraneh Sayadi, Denis Sipp, Peter Schmid, Georgios Rigas
- Abstract summary: We introduce a novel machine learning embedding in the autoencoder, which uses spatial transformer networks and Siamese networks to account for continuous and discrete symmetries.
The proposed symmetry-aware autoencoder is invariant to predetermined input transformations dictating the dynamics of the underlying physical system.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Nonlinear principal component analysis (nlPCA) via autoencoders has attracted
attention in the dynamical systems community due to its larger compression rate
when compared to linear principal component analysis (PCA). These model
reduction methods experience an increase in the dimensionality of the latent
space when applied to datasets that exhibit globally invariant samples due to
the presence of symmetries. In this study, we introduce a novel machine
learning embedding in the autoencoder, which uses spatial transformer networks
and Siamese networks to account for continuous and discrete symmetries,
respectively. The spatial transformer network discovers the optimal shift for
the continuous translation or rotation so that invariant samples are aligned in
the periodic directions. Similarly, the Siamese networks collapse samples that
are invariant under discrete shifts and reflections. Thus, the proposed
symmetry-aware autoencoder is invariant to predetermined input transformations
dictating the dynamics of the underlying physical system. This embedding can be
employed with both linear and nonlinear reduction methods, which we term
symmetry-aware PCA (s-PCA) and symmetry-aware nlPCA (s-nlPCA). We apply the
proposed framework to three fluid flow problems: Burgers' equation, the
simulation of the flow through a step diffuser, and the Kolmogorov flow, to
showcase its capabilities for cases exhibiting only continuous symmetries,
only discrete symmetries, or a combination of both.
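The continuous-symmetry part of the idea can be sketched without neural networks at all: if samples on a periodic domain differ only by a translation, circularly shifting each one so that the phase of its first Fourier mode vanishes aligns them, and ordinary PCA then compresses far better. The sketch below (plain NumPy; a minimal stand-in for the paper's spatial transformer network, with all names illustrative) demonstrates this on shifted copies of a single bump.

```python
import numpy as np

def align_periodic(X):
    """Circularly shift each sample so the phase of its first Fourier mode
    vanishes. On a periodic domain, translated copies of one pattern differ
    only in this phase, so removing it aligns them."""
    n = X.shape[1]
    out = np.empty_like(X)
    for i, x in enumerate(X):
        phase = np.angle(np.fft.fft(x)[1])            # phase of mode k = 1
        shift = int(round(phase * n / (2 * np.pi))) % n
        out[i] = np.roll(x, shift)
    return out

# Translated copies of one bump: raw PCA spreads the variance over many
# modes, but after alignment the centred data has (numerically) none left.
rng = np.random.default_rng(0)
grid = np.linspace(0, 2 * np.pi, 128, endpoint=False)
base = np.exp(-(grid - np.pi) ** 2)
X = np.stack([np.roll(base, rng.integers(128)) for _ in range(50)])

s_raw = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)
Xa = align_periodic(X)
s_aligned = np.linalg.svd(Xa - Xa.mean(axis=0), compute_uv=False)
print(s_raw[0] / s_raw.sum())        # well below 1: many raw modes needed
print(s_aligned.sum() < 1e-8)        # True: one aligned sample suffices
```

The same alignment step can precede a nonlinear autoencoder instead of the SVD, which corresponds to the nlPCA side of the proposal.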
Related papers
- Group Crosscoders for Mechanistic Analysis of Symmetry [0.0]
Group crosscoders systematically discover and analyse symmetrical features in neural networks.
We show that group crosscoders can provide systematic insights into how neural networks represent symmetry.
arXiv Detail & Related papers (2024-10-31T17:47:01Z)
- EqNIO: Subequivariant Neural Inertial Odometry [33.96552018734359]
We show that IMU data transforms equivariantly, when rotated around the gravity vector and reflected with respect to arbitrary planes parallel to gravity.
We then map the IMU data into this frame, thereby achieving an invariant canonicalization that can be directly used with off-the-shelf inertial odometry networks.
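The canonicalization idea can be illustrated with a toy yaw alignment in 2-D (a hedged sketch, not the paper's subequivariant frame construction; all names are illustrative): rotate the horizontal components of a trajectory so that its first sample has a fixed heading, making the result invariant to rotations about the gravity axis.

```python
import numpy as np

def yaw_canonicalize(vecs):
    """Rotate a sequence of horizontal 2-D vectors so the first one points
    along +x. The output no longer depends on the heading of the input
    frame, so it can be fed to an ordinary (non-equivariant) network."""
    theta = np.arctan2(vecs[0, 1], vecs[0, 0])     # heading of first vector
    c, s = np.cos(-theta), np.sin(-theta)
    R = np.array([[c, -s], [s, c]])                # rotation by -theta
    return vecs @ R.T, theta                       # canonical data + undo angle

# Invariance check: rotating the whole input about the vertical axis
# leaves the canonicalized trajectory unchanged.
rng = np.random.default_rng(1)
traj = rng.standard_normal((10, 2))
phi = 0.7
Rphi = np.array([[np.cos(phi), -np.sin(phi)], [np.sin(phi), np.cos(phi)]])
canon_a, _ = yaw_canonicalize(traj)
canon_b, _ = yaw_canonicalize(traj @ Rphi.T)
print(np.allclose(canon_a, canon_b))               # True
```

The returned angle lets downstream predictions be rotated back into the original frame, recovering equivariance rather than mere invariance.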
arXiv Detail & Related papers (2024-08-12T17:42:46Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs)
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards the practical use of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Identifying Systems with Symmetries using Equivariant Autoregressive Reservoir Computers [0.0]
The investigation focuses on identifying systems with symmetries using equivariant autoregressive reservoir computers.
General results in structured matrix approximation theory are presented, exploring a two-fold approach.
arXiv Detail & Related papers (2023-11-16T02:32:26Z)
- Non Commutative Convolutional Signal Models in Neural Networks: Stability to Small Deformations [111.27636893711055]
We study the filtering and stability properties of non-commutative convolutional filters.
Our results have direct implications for group neural networks, multigraph neural networks and quaternion neural networks.
arXiv Detail & Related papers (2023-10-05T20:27:22Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variant of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
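Without-replacement minibatching in an SGLD-style update can be sketched as follows (a generic illustration of the standard ingredients, not the specific variant proposed in the paper; all names and parameters are illustrative): each epoch shuffles the dataset once and consumes disjoint minibatches, so every sample is visited exactly once per epoch.

```python
import numpy as np

def sgld_without_replacement(grad, theta, data, lr=1e-3, temp=1e-4,
                             batch_size=10, epochs=100, rng=None):
    """SGLD with without-replacement minibatching: shuffle once per epoch,
    walk through disjoint minibatches, and add Gaussian noise whose scale
    sqrt(2 * lr * temp) follows the usual Langevin discretization."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(data)
    for _ in range(epochs):
        perm = rng.permutation(n)                 # sample without replacement
        for start in range(0, n, batch_size):
            batch = data[perm[start:start + batch_size]]
            noise = np.sqrt(2 * lr * temp) * rng.standard_normal(theta.shape)
            theta = theta - lr * grad(theta, batch) + noise
    return theta

# Toy use: sample near the mean of Gaussian data (quadratic loss).
rng = np.random.default_rng(2)
data = rng.normal(3.0, 1.0, size=200)
grad = lambda th, b: th - b.mean()                # d/dth of 0.5*(th - mean)^2
theta = sgld_without_replacement(grad, np.array(0.0), data, lr=0.1,
                                 temp=1e-4, epochs=200, rng=rng)
print(abs(theta - 3.0) < 0.5)                     # True: hovers near the mean
```

At low temperature the iterate concentrates near the loss minimum; raising `temp` widens the stationary distribution, which is the equilibrium/non-equilibrium distinction the paper analyses.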
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Learning Symmetric Embeddings for Equivariant World Models [9.781637768189158]
We propose learning symmetric embedding networks (SENs) that encode an input space (e.g. images) into a feature space that transforms naturally under the system's symmetries.
This network can be trained end-to-end with an equivariant task network to learn an explicitly symmetric representation.
Our experiments demonstrate that SENs facilitate the application of equivariant networks to data with complex symmetry representations.
arXiv Detail & Related papers (2022-04-24T22:31:52Z)
- Approximately Equivariant Networks for Imperfectly Symmetric Dynamics [24.363954435050264]
We find that our models can outperform both baselines with no symmetry bias and baselines with overly strict symmetry in both simulated turbulence domains and real-world multi-stream jet flow.
arXiv Detail & Related papers (2022-01-28T07:31:28Z)
- Sampling asymmetric open quantum systems for artificial neural networks [77.34726150561087]
We present a hybrid sampling strategy which takes asymmetric properties explicitly into account, achieving fast convergence times and high scalability for asymmetric open systems.
We highlight the universal applicability of artificial neural networks to asymmetric open systems.
arXiv Detail & Related papers (2020-12-20T18:25:29Z)
- Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.