Latent Mixture of Symmetries for Sample-Efficient Dynamic Learning
- URL: http://arxiv.org/abs/2510.03578v1
- Date: Sat, 04 Oct 2025 00:06:31 GMT
- Title: Latent Mixture of Symmetries for Sample-Efficient Dynamic Learning
- Authors: Haoran Li, Chenhan Xiao, Muhao Guo, Yang Weng
- Abstract summary: Learning dynamics is essential for model-based control and Reinforcement Learning in engineering systems. We propose the Latent Mixture of Symmetries (Latent MoS), an expressive model that captures a mixture of symmetry-governed latent factors from complex dynamical measurements.
- Score: 7.722898209589864
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning dynamics is essential for model-based control and Reinforcement Learning in engineering systems, such as robotics and power systems. However, limited system measurements, such as those from low-resolution sensors, demand sample-efficient learning. Symmetry provides a powerful inductive bias by characterizing equivariant relations in system states to improve sample efficiency. While recent methods attempt to discover symmetries from data, they typically assume a single global symmetry group and treat symmetry discovery and dynamic learning as separate tasks, leading to limited expressiveness and error accumulation. In this paper, we propose the Latent Mixture of Symmetries (Latent MoS), an expressive model that captures a mixture of symmetry-governed latent factors from complex dynamical measurements. Latent MoS focuses on dynamic learning while locally and provably preserving the underlying symmetric transformations. To further capture long-term equivariance, we introduce a hierarchical architecture that stacks MoS blocks. Numerical experiments in diverse physical systems demonstrate that Latent MoS outperforms state-of-the-art baselines in interpolation and extrapolation tasks while offering interpretable latent representations suitable for future geometric and safety-critical analyses.
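The abstract's core claim is that symmetry acts as an inductive bias: if the dynamics map commutes with a group of transformations, every observed transition implies a whole orbit of valid transitions. The paper's actual model is not reproduced here; as a minimal, hypothetical sketch of that equivariance property (using a 2-D rotation group and a hand-picked linear dynamics map, neither of which comes from the paper):

```python
import numpy as np

def rotation(theta):
    """2-D rotation matrix: one example of a symmetry group element."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# A latent dynamics map A that commutes with every rotation (here: a
# scaled rotation itself) is equivariant by construction: A @ R == R @ A.
A = 0.9 * rotation(0.3)  # hypothetical "learned" latent dynamics

x = np.array([1.0, 2.0])  # a latent state
R = rotation(1.1)         # an arbitrary symmetry transformation

lhs = A @ (R @ x)  # transform the state, then step the dynamics
rhs = R @ (A @ x)  # step the dynamics, then transform the result

# Equivariance f(Rx) = R f(x): both orders give the same next state.
print(np.allclose(lhs, rhs))
```

Because planar rotations commute, the constraint holds exactly for this toy map; Latent MoS instead learns a *mixture* of such symmetry-governed factors and preserves them only locally, per the abstract.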
Related papers
- Equivariant Evidential Deep Learning for Interatomic Potentials [55.6997213490859]
Uncertainty quantification is critical for assessing the reliability of machine learning interatomic potentials in molecular dynamics simulations. Existing UQ approaches for MLIPs are often limited by high computational cost or suboptimal performance. We propose Equivariant Evidential Deep Learning for Interatomic Potentials ($e^2$IP), a backbone-agnostic framework that models atomic forces and their uncertainty jointly.
arXiv Detail & Related papers (2026-02-11T02:00:25Z)
- Disordered Dynamics in High Dimensions: Connections to Random Matrices and Machine Learning [52.26396748560348]
We provide an overview of high dimensional dynamical systems driven by random matrices. We focus on applications to simple models of learning and generalization in machine learning theory.
arXiv Detail & Related papers (2026-01-03T00:12:32Z)
- SEAL - A Symmetry EncourAging Loss for High Energy Physics [0.005211875900848231]
Building machine learning models that explicitly respect symmetries can be difficult due to the dedicated components required. We introduce soft constraints that allow the model to decide the importance of added symmetries during the learning process instead of enforcing exact symmetries.
arXiv Detail & Related papers (2025-11-03T19:00:13Z)
- A Bayesian Framework for Symmetry Inference in Chaotic Attractors [0.0]
We present a framework that formulates symmetry detection as probabilistic model selection over a lattice of candidate subgroups. An application to human gait dynamics reveals symmetry changes induced by mechanical constraints.
arXiv Detail & Related papers (2025-10-18T13:49:35Z)
- Symmetries-enhanced Multi-Agent Reinforcement Learning [25.383183391244373]
Multi-agent reinforcement learning has emerged as a powerful framework for enabling agents to learn complex, coordinated behaviors. Recent advancements have sought to alleviate those issues by embedding intrinsic symmetries of the systems in the policy. This paper presents a novel framework for embedding extrinsic symmetries in multi-agent system dynamics.
arXiv Detail & Related papers (2025-01-02T08:41:31Z)
- Exploiting Symmetry in Dynamics for Model-Based Reinforcement Learning with Asymmetric Rewards [0.6612847014373572]
We introduce a technique for learning dynamics that, by construction, exhibit specified symmetries.
Numerical experiments demonstrate that the proposed method learns a more accurate dynamical model.
arXiv Detail & Related papers (2024-03-27T21:31:46Z)
- Learning Multiscale Consistency for Self-supervised Electron Microscopy Instance Segmentation [48.267001230607306]
We propose a pretraining framework that enhances multiscale consistency in EM volumes.
Our approach leverages a Siamese network architecture, integrating strong and weak data augmentations.
It effectively captures voxel and feature consistency, showing promise for learning transferable representations for EM analysis.
arXiv Detail & Related papers (2023-08-19T05:49:13Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- On discrete symmetries of robotics systems: A group-theoretic and data-driven analysis [38.92081817503126]
We study discrete morphological symmetries of dynamical systems.
These symmetries arise from the presence of one or more planes/axis of symmetry in the system's morphology.
We exploit these symmetries using data augmentation and $G$-equivariant neural networks.
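The data-augmentation route mentioned in this entry follows from equivariance: if transition $(x, x')$ is observed and the dynamics commute with a symmetry $g$, then $(gx, gx')$ is also a valid transition and can be added for free. A small, hypothetical sketch for one mirror symmetry (the reflection matrix and dataset below are illustrative, not from the paper):

```python
import numpy as np

# Reflection across the x-axis: a discrete morphological symmetry g with g @ g = I.
G = np.diag([1.0, -1.0])

def augment(states, next_states):
    """Double a transition dataset with its mirrored copy.

    Valid only when the true dynamics are G-equivariant: then each
    observed pair (x, x') implies the mirrored pair (Gx, Gx').
    """
    return (np.concatenate([states, states @ G.T]),
            np.concatenate([next_states, next_states @ G.T]))

X = np.array([[0.5, 1.0], [2.0, -1.0]])    # toy observed states
Xn = np.array([[0.4, 0.9], [1.8, -0.8]])   # toy observed next states
Xa, Xna = augment(X, Xn)                   # dataset size doubles
```

The alternative the entry names, $G$-equivariant networks, bakes the same constraint into the architecture instead of the dataset.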
arXiv Detail & Related papers (2023-02-21T04:10:16Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
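A latent ODE of the kind this entry describes evolves a hidden state continuously and conditions its vector field on system inputs; giving each input its own weight block is one simple way to separate input effects in the latent space. A minimal, hypothetical Euler-integration sketch (the dimensions, weights, and `tanh` vector field are illustrative assumptions, not the paper's model):

```python
import numpy as np

def latent_ode_step(z, u, W_z, W_u, dt=0.01):
    """One Euler step of a latent ODE dz/dt = tanh(W_z z + W_u u).

    W_u acts only on the input u, so each input channel contributes
    its own factor of variation to the latent trajectory.
    """
    dz = np.tanh(W_z @ z + W_u @ u)
    return z + dt * dz

rng = np.random.default_rng(0)
W_z = 0.1 * rng.normal(size=(4, 4))  # latent-to-latent dynamics
W_u = 0.1 * rng.normal(size=(4, 2))  # input-to-latent coupling
z = np.zeros(4)
u = np.array([1.0, 0.0])             # hold one input channel on

for _ in range(100):                 # roll the latent state forward
    z = latent_ode_step(z, u, W_z, W_u)
```

In practice such models are trained end-to-end with an adaptive ODE solver rather than fixed-step Euler; the fixed step is used here only to keep the sketch self-contained.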
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.