KEEC: Embed to Control on An Equivariant Geometry
- URL: http://arxiv.org/abs/2312.01544v2
- Date: Sun, 10 Dec 2023 11:11:49 GMT
- Title: KEEC: Embed to Control on An Equivariant Geometry
- Authors: Xiaoyuan Cheng, Yiming Yang, Wei Jiang, Yukun Hu
- Abstract summary: This paper investigates how representation learning can enable optimal control in unknown and complex dynamics.
Koopman Embed to Equivariant Control (KEEC) is proposed for model learning and control.
The effectiveness of KEEC is demonstrated in challenging dynamical systems, including chaotic ones like Lorenz-63.
- Score: 32.21549079265448
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper investigates how representation learning can enable optimal
control in unknown and complex dynamics, such as chaotic and non-linear
systems, without relying on prior domain knowledge of the dynamics. The core
idea is to establish an equivariant geometry that is diffeomorphic to the
manifold defined by a dynamical system and to perform optimal control within
this corresponding geometry, which is a non-trivial task. To address this
challenge, Koopman Embed to Equivariant Control (KEEC) is proposed for model
learning and control. Inspired by Lie theory, KEEC begins by learning a
non-linear dynamical system defined on a manifold and embedding trajectories
into a Lie group. Subsequently, KEEC formulates an equivariant value function
equation for reinforcement learning on the equivariant geometry, ensuring it has the
same (invariant) effect as the value function on the original manifold. By deriving
analytical-form optimal actions on the equivariant value function, KEEC
theoretically achieves quadratic convergence for the optimal equivariant value
function by leveraging the differential information on the equivariant
geometry. The effectiveness of KEEC is demonstrated in challenging dynamical
systems, including chaotic ones like Lorenz-63. Notably, our results show that
isometric loss functions, which maintain the compactness and completeness of the
geometry while preserving its metric and differential information, consistently
outperform loss functions lacking these characteristics.
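Below is a minimal, self-contained sketch of the embed-then-control idea described in the abstract, not the authors' KEEC implementation: a simulated damped pendulum stands in for the unknown dynamics, fixed random Fourier features stand in for the learned isometric Koopman embedding, and a finite-horizon Riccati recursion (a latent LQR) stands in for the equivariant value-function iteration. All names (pendulum_step, embed, etc.) are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): embed states into a lifted space,
# fit linear latent dynamics there, and recover an analytic feedback law.
import numpy as np

rng = np.random.default_rng(0)

def pendulum_step(x, u, dt=0.05):
    """One Euler step of a damped pendulum; x = (theta, omega), u = torque."""
    theta, omega = x
    domega = -np.sin(theta) - 0.1 * omega + u
    return np.array([theta + dt * omega, omega + dt * domega])

# 1) Koopman-style lift: raw state plus fixed random Fourier features.
n_feat, dim_x = 30, 2
W = rng.normal(size=(n_feat, dim_x))
b = rng.uniform(0.0, 2.0 * np.pi, size=n_feat)

def embed(x):
    return np.concatenate([x, np.cos(W @ x + b)])   # z = phi(x)

dim_z = dim_x + n_feat

# 2) Fit linear latent dynamics z' ~= A z + B u by least squares on rollouts.
Z, Zn, U = [], [], []
for _ in range(200):
    x = rng.uniform([-np.pi, -2.0], [np.pi, 2.0])
    for _ in range(20):
        u = rng.uniform(-1.0, 1.0)
        xn = pendulum_step(x, u)
        Z.append(embed(x)); U.append([u]); Zn.append(embed(xn))
        x = xn
Z, Zn, U = np.array(Z), np.array(Zn), np.array(U)
sol, *_ = np.linalg.lstsq(np.hstack([Z, U]), Zn, rcond=None)
A, B = sol[:dim_z].T, sol[dim_z:].T

# 3) Quadratic latent value function via a finite-horizon Riccati recursion
#    (latent LQR); only the raw-state coordinates of z are penalised.
Q = np.zeros((dim_z, dim_z)); Q[0, 0] = Q[1, 1] = 1.0
R = 0.1 * np.eye(1)
P = Q.copy()
for _ in range(50):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

def optimal_action(x):
    """Analytic optimal action: linear feedback on the embedded state."""
    return float(-(K @ embed(x))[0])

# Usage: drive the pendulum to rest at the origin from a perturbed start.
x = np.array([0.8, 0.0])
for _ in range(200):
    x = pendulum_step(x, np.clip(optimal_action(x), -2.0, 2.0))
print("final state (near the origin):", x)
```

The closed-form feedback u = -Kz mirrors, in the simplest linear-quadratic setting, the paper's use of differential information on the embedded geometry to obtain analytical optimal actions; the paper itself works with a learned equivariant embedding and a more general value-function equation.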
Related papers
- Relaxing Continuous Constraints of Equivariant Graph Neural Networks for Physical Dynamics Learning [39.25135680793105]
We propose a general Discrete Equivariant Graph Neural Network (DEGNN) that guarantees equivariance to a given discrete point group.
Specifically, we show that such discrete equivariant message passing could be constructed by transforming geometric features into permutation-invariant embeddings.
We show that DEGNN is data efficient, learning with less data, and can generalize across scenarios such as unobserved orientation.
arXiv Detail & Related papers (2024-06-24T03:37:51Z)
- Shape Arithmetic Expressions: Advancing Scientific Discovery Beyond Closed-Form Equations [56.78271181959529]
Generalized Additive Models (GAMs) can capture non-linear relationships between variables and targets, but they cannot capture intricate feature interactions.
We propose Shape Arithmetic Expressions (SHAREs), which fuse GAMs' flexible shape functions with the complex feature interactions found in mathematical expressions.
We also design a set of rules for constructing SHAREs that guarantee transparency of the found expressions beyond the standard constraints.
arXiv Detail & Related papers (2024-04-15T13:44:01Z)
- Symmetry Preservation in Hamiltonian Systems: Simulation and Learning [0.9208007322096532]
This work presents a general geometric framework for simulating and learning the dynamics of Hamiltonian systems.
We propose to simulate and learn the mappings of interest through the construction of $G$-invariant Lagrangian submanifolds.
Our designs leverage pivotal techniques and concepts in symplectic geometry and geometric mechanics.
arXiv Detail & Related papers (2023-08-30T21:34:33Z)
- Physics-Informed Quantum Machine Learning: Solving nonlinear differential equations in latent spaces without costly grid evaluations [21.24186888129542]
We propose a physics-informed quantum algorithm to solve nonlinear and multidimensional differential equations.
By measuring the overlaps between states which are representations of DE terms, we construct a loss that does not require independent sequential function evaluations on grid points.
When the loss is trained variationally, our approach can be related to the differentiable quantum circuit protocol.
arXiv Detail & Related papers (2023-08-03T15:38:31Z)
- Propagating Kernel Ambiguity Sets in Nonlinear Data-driven Dynamics Models [3.743859059772078]
Given a nonlinear data-driven dynamical system model, how can one propagate the ambiguity sets forward for multiple steps?
This problem is the key to solving distributionally robust control and learning-based control of such learned system models under a data-distribution shift.
We propose an algorithm that exactly propagates ambiguity sets through nonlinear data-driven models using the Koopman operator and conditional mean embeddings (CMEs), via the kernel maximum mean discrepancy geometry.
arXiv Detail & Related papers (2023-04-27T09:38:49Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN represents, by generalized coordinates, the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Geometric and Physical Quantities improve E(3) Equivariant Message Passing [59.98327062664975]
We introduce Steerable E(3) Equivariant Graph Neural Networks (SEGNNs) that generalise equivariant graph networks.
This model, composed of steerables, is able to incorporate geometric and physical information in both the message and update functions.
We demonstrate the effectiveness of our method on several tasks in computational physics and chemistry.
arXiv Detail & Related papers (2021-10-06T16:34:26Z)
- Functional Space Analysis of Local GAN Convergence [26.985600125290908]
We study the local dynamics of adversarial training in the general functional space.
We show how it can be represented as a system of partial differential equations.
Our perspective reveals several insights on the practical tricks commonly used to stabilize GANs.
arXiv Detail & Related papers (2021-02-08T18:59:46Z)
- Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable Dynamical Systems [74.80320120264459]
We present an approach to learn complex motions from a limited number of human demonstrations.
The complex motions are encoded as rollouts of a stable dynamical system.
The efficacy of this approach is demonstrated through validation on an established benchmark as well as on demonstrations collected on a real-world robotic system.
arXiv Detail & Related papers (2020-05-27T03:51:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.