Geometric Principles for Machine Learning of Dynamical Systems
- URL: http://arxiv.org/abs/2502.13895v1
- Date: Wed, 19 Feb 2025 17:28:40 GMT
- Title: Geometric Principles for Machine Learning of Dynamical Systems
- Authors: Zack Xuereb Conti, David J Wagg, Nick Pepper
- Abstract summary: This paper proposes leveraging structure-rich geometric spaces for machine learning to achieve structural generalization.
We illustrate this view through the machine learning of linear time-invariant dynamical systems.
- Abstract: Mathematical descriptions of dynamical systems are deeply rooted in topological spaces defined by non-Euclidean geometry. This paper proposes leveraging structure-rich geometric spaces for machine learning to achieve structural generalization when modeling physical systems from data, in contrast to embedding physics bias within model-free architectures. We consider model generalization to be a function of symmetry, invariance and uniqueness, defined as a topological mapping from state space dynamics to the parameter space. We illustrate this view through the machine learning of linear time-invariant dynamical systems, whose dynamics reside on the symmetric positive definite manifold.
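To make the abstract's central object concrete, the sketch below (not taken from the paper) shows one standard way a stable linear time-invariant system x' = Ax + Bu maps to a point on the symmetric positive definite (SPD) manifold: its controllability Gramian, obtained from a Lyapunov equation. Systems can then be compared with the affine-invariant Riemannian distance on SPD matrices rather than a Euclidean one. The example matrices, helper names, and the choice of the Gramian are illustrative assumptions, not the paper's construction.
```python
# Minimal sketch: a stable LTI system (A, B) yields an SPD controllability
# Gramian P solving A P + P A^T = -B B^T, i.e. a point on the SPD manifold.
# Distances between systems are then measured with the affine-invariant metric.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, fractional_matrix_power, logm

def controllability_gramian(A, B):
    """SPD controllability Gramian of a stable LTI system x' = Ax + Bu."""
    return solve_continuous_lyapunov(A, -B @ B.T)

def spd_distance(P, Q):
    """Affine-invariant Riemannian distance between two SPD matrices."""
    P_inv_sqrt = fractional_matrix_power(P, -0.5)
    M = P_inv_sqrt @ Q @ P_inv_sqrt
    return np.linalg.norm(np.real(logm(M)), "fro")

# Two hypothetical second-order systems differing only in damping.
A1 = np.array([[0.0, 1.0], [-1.0, -0.4]])
A2 = np.array([[0.0, 1.0], [-1.0, -0.6]])
B = np.array([[0.0], [1.0]])

P1 = controllability_gramian(A1, B)
P2 = controllability_gramian(A2, B)
print("Riemannian (SPD) distance:", spd_distance(P1, P2))
print("Euclidean distance:       ", np.linalg.norm(P1 - P2, "fro"))
```
The Riemannian distance respects the curvature of the SPD manifold, which is the kind of geometric structure the abstract refers to when contrasting structure-rich spaces with Euclidean embeddings.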
Related papers
- Geometric Trajectory Diffusion Models [58.853975433383326]
Generative models have shown great promise in generating 3D geometric systems.
Existing approaches only operate on static structures, neglecting the fact that physical systems are always dynamic in nature.
We propose geometric trajectory diffusion models (GeoTDM), the first diffusion model for modeling the temporal distribution of 3D geometric trajectories.
arXiv Detail & Related papers (2024-10-16T20:36:41Z)
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
- Dynamics Harmonic Analysis of Robotic Systems: Application in Data-Driven Koopman Modelling [24.738444847113232]
We introduce the use of harmonic analysis to decompose the state space of symmetric robotic systems into isotypic subspaces.
For linear dynamics, we characterize how this decomposition leads to a subdivision of the dynamics into independent linear systems on each subspace.
Our architecture, validated on synthetic systems and the dynamics of locomotion of a quadrupedal robot, exhibits enhanced generalization, sample efficiency, and interpretability.
arXiv Detail & Related papers (2023-12-12T17:34:42Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We demonstrate the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Symmetry Preservation in Hamiltonian Systems: Simulation and Learning [0.9208007322096532]
This work presents a general geometric framework for simulating and learning the dynamics of Hamiltonian systems.
We propose to simulate and learn the mappings of interest through the construction of $G$-invariant Lagrangian submanifolds.
Our designs leverage pivotal techniques and concepts in symplectic geometry and geometric mechanics.
arXiv Detail & Related papers (2023-08-30T21:34:33Z)
- GD-VAEs: Geometric Dynamic Variational Autoencoders for Learning Nonlinear Dynamics and Dimension Reductions [0.0]
We develop data-driven methods to learn parsimonious representations of nonlinear dynamics from observations.
The approaches learn nonlinear state-space models of the dynamics for general manifold latent spaces.
Motivated by problems arising in parameterized PDEs and physics, we investigate the performance of our methods on tasks for learning reduced dimensional representations.
arXiv Detail & Related papers (2022-06-10T15:23:23Z)
- Symmetry Group Equivariant Architectures for Physics [52.784926970374556]
In the domain of machine learning, an awareness of symmetries has driven impressive performance breakthroughs.
We argue that both the physics community and the broader machine learning community have much to learn from each other.
arXiv Detail & Related papers (2022-03-11T18:27:04Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- Variational Autoencoders for Learning Nonlinear Dynamics of Physical Systems [0.0]
We develop data-driven methods that incorporate physical information into priors to learn parsimonious representations of nonlinear systems.
Our approach is based on Variational Autoencoders (VAEs) for learning nonlinear state-space models from observations.
arXiv Detail & Related papers (2020-12-07T05:00:22Z)
- OnsagerNet: Learning Stable and Interpretable Dynamics using a Generalized Onsager Principle [19.13913681239968]
We learn stable and physically interpretable dynamical models using sampled trajectory data from physical processes based on a generalized Onsager principle.
We further apply this method to study Rayleigh-Bénard convection and learn Lorenz-like low-dimensional autonomous reduced order models.
arXiv Detail & Related papers (2020-09-06T07:30:59Z)
- Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable Dynamical Systems [74.80320120264459]
We present an approach to learn complex motions from a limited number of human demonstrations.
The motions are encoded as rollouts of a stable dynamical system.
The efficacy of this approach is demonstrated through validation on an established benchmark as well as on demonstrations collected on a real-world robotic system.
arXiv Detail & Related papers (2020-05-27T03:51:57Z)