Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable
Dynamical Systems
- URL: http://arxiv.org/abs/2005.13143v2
- Date: Mon, 21 Sep 2020 17:28:20 GMT
- Title: Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable
Dynamical Systems
- Authors: Muhammad Asif Rana, Anqi Li, Dieter Fox, Byron Boots, Fabio Ramos,
Nathan Ratliff
- Abstract summary: We present an approach to learn such motions from a limited number of human demonstrations.
The complex motions are encoded as rollouts of a stable dynamical system.
The efficacy of this approach is demonstrated through validation on an established benchmark as well as on demonstrations collected on a real-world robotic system.
- Score: 74.80320120264459
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Robotic tasks often require motions with complex geometric structures. We
present an approach to learn such motions from a limited number of human
demonstrations by exploiting the regularity properties of human motions, e.g.,
stability, smoothness, and boundedness. The complex motions are encoded as
rollouts of a stable dynamical system, which, under a change of coordinates
defined by a diffeomorphism, is equivalent to a simple, hand-specified
dynamical system. As an immediate result of using diffeomorphisms, the
stability property of the hand-specified dynamical system directly carries over
to the learned dynamical system. Inspired by recent works in density
estimation, we propose to represent the diffeomorphism as a composition of
simple parameterized diffeomorphisms. Additional structure is imposed to
provide guarantees on the smoothness of the generated motions. The efficacy of
this approach is demonstrated through validation on an established benchmark as well as on demonstrations collected on a real-world robotic system.
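To make the construction concrete, here is a minimal, self-contained sketch of the underlying idea (not the authors' implementation: the coupling-style layers, their fixed untrained parameters, and the finite-difference Jacobian are illustrative assumptions, and no learning from demonstrations is performed). A diffeomorphism Phi built from simple parameterized layers pulls the hand-specified stable system y' = -y back to x' = J_Phi(x)^{-1} (-Phi(x)); since d/dt Phi(x(t)) = -Phi(x(t)), every trajectory converges to Phi^{-1}(0), so the stability of the simple system carries over to the pulled-back one.

```python
import numpy as np

# Sketch: pull the simple stable system y' = -y back through a diffeomorphism Phi,
# yielding x' = J_Phi(x)^{-1} (-Phi(x)), whose trajectories converge to Phi^{-1}(0).

def coupling_layer(params):
    """Invertible 2-D map: y1 = x1, y2 = x2 * exp(s(x1)) + t(x1)."""
    w_s, b_s, w_t, b_t = params

    def f(x):
        x1, x2 = x
        s = np.tanh(w_s * x1 + b_s)   # bounded scale keeps the map well-conditioned
        t = np.tanh(w_t * x1 + b_t)
        return np.array([x1, x2 * np.exp(s) + t])

    return f

rng = np.random.default_rng(0)
layers = []
for _ in range(4):
    layers.append(coupling_layer(rng.normal(size=4)))  # arbitrary (untrained) parameters
    layers.append(lambda x: x[::-1])                    # swap coordinates so both get transformed

def phi(x):
    """Composition of simple parameterized diffeomorphisms, Phi = f_K o ... o f_1."""
    for f in layers:
        x = f(x)
    return x

def jacobian(f, x, eps=1e-6):
    """Central finite-difference Jacobian (analytic Jacobians are cheap for coupling layers)."""
    n = len(x)
    J = np.zeros((n, n))
    for i in range(n):
        d = np.zeros(n)
        d[i] = eps
        J[:, i] = (f(x + d) - f(x - d)) / (2 * eps)
    return J

def pulled_back_velocity(x):
    """x' = J_Phi(x)^{-1} * (-Phi(x)): the pullback of the stable system y' = -y."""
    return np.linalg.solve(jacobian(phi, x), -phi(x))

# Euler rollout: Phi(x(t)) decays exponentially, so x(t) converges to Phi^{-1}(0).
x = np.array([1.5, -0.8])
for _ in range(2000):
    x = x + 0.01 * pulled_back_velocity(x)
print("final x:", x, " Phi(final x):", phi(x))  # Phi(x) should be near zero
```

Each coupling layer has a triangular Jacobian with positive diagonal scaling, so it is invertible everywhere; that is what makes the composed change of coordinates a global diffeomorphism and lets the stability argument go through.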
Related papers
- Identifiable Representation and Model Learning for Latent Dynamic Systems [0.0]
We study the problem of identifiable representation and model learning for latent dynamic systems.
We prove that, for linear or affine nonlinear latent dynamic systems, it is possible to identify the representations up to scaling and determine the models up to some simple transformations.
arXiv Detail & Related papers (2024-10-23T13:55:42Z)
- Data-driven ODE modeling of the high-frequency complex dynamics of a fluid flow [0.0]
We propose a novel method of modeling such dynamics, including the high-frequency intermittent behavior of a fluid flow.
We construct an autonomous joint model composed of two parts: the first is an autonomous system of a base variable, and the other concerns the targeted variable being affected by a term.
The constructed joint model succeeded in not only inferring a short trajectory but also reconstructing chaotic sets and statistical properties obtained from a long trajectory.
arXiv Detail & Related papers (2024-09-01T09:06:32Z)
- AI-Lorenz: A physics-data-driven framework for black-box and gray-box identification of chaotic systems with symbolic regression [2.07180164747172]
We develop a framework that learns mathematical expressions modeling complex dynamical behaviors.
We train a small neural network to learn the dynamics of a system, its rate of change in time, and missing model terms.
This, in turn, enables us to predict the future evolution of the dynamical behavior.
arXiv Detail & Related papers (2023-12-21T18:58:41Z)
- Learning minimal representations of stochastic processes with variational autoencoders [52.99137594502433]
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of unknown parameters describing such processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z)
- Evolve Smoothly, Fit Consistently: Learning Smooth Latent Dynamics For Advection-Dominated Systems [14.553972457854517]
We present a data-driven, space-time continuous framework to learn surrogate models for complex physical systems.
We leverage the expressive power of the network and a specially designed consistency-inducing regularization to obtain latent trajectories that are both low-dimensional and smooth.
arXiv Detail & Related papers (2023-01-25T03:06:03Z)
- Learning Riemannian Stable Dynamical Systems via Diffeomorphisms [0.23204178451683263]
Dexterous and autonomous robots should be capable of skillfully executing elaborate dynamical motions.
Learning techniques may be leveraged to build models of such dynamic skills.
To accomplish this, the learning model needs to encode a stable vector field that resembles the desired motion dynamics.
arXiv Detail & Related papers (2022-11-06T16:28:45Z)
- MoDi: Unconditional Motion Synthesis from Diverse Data [51.676055380546494]
We present MoDi, an unconditional generative model that synthesizes diverse motions.
Our model is trained in a completely unsupervised setting from a diverse, unstructured and unlabeled motion dataset.
We show that despite the lack of any structure in the dataset, the latent space can be semantically clustered.
arXiv Detail & Related papers (2022-06-16T09:06:25Z)
- Differentiable Simulation of Soft Multi-body Systems [99.4302215142673]
We develop a top-down matrix assembly algorithm within Projective Dynamics.
We derive a differentiable control framework for soft articulated bodies driven by muscles, joint torques, or pneumatic tubes.
arXiv Detail & Related papers (2022-05-03T20:03:22Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN represents the forward kinematics information (positions and velocities) of a structural object in terms of generalized coordinates.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
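The structured latent ODE entry above describes learning per-input factors of variation so that the effects of different system inputs are separated in the latent space. As a loose illustration of that idea only (a hand-built toy, not the paper's model; the names latent_ode, rollout, B_block and the block-structured loading are assumptions for this sketch):

```python
import numpy as np

# Toy sketch: a latent ODE dz/dt = A z + B u whose block-structured input loading
# confines each static input's effect to its own group of latent dimensions.

rng = np.random.default_rng(1)
latent_dim, n_inputs, block = 4, 2, 2

A = -np.eye(latent_dim)                      # diagonal, stable base dynamics (keeps blocks independent)
B_block = np.zeros((latent_dim, n_inputs))   # input i drives only latent block i
B_block[0:block, 0] = rng.normal(size=block)
B_block[block:, 1] = rng.normal(size=block)

def latent_ode(z, u):
    """dz/dt: each input only moves its own latent factor of variation."""
    return A @ z + B_block @ u

def rollout(z0, u, dt=0.01, steps=1000):
    """Forward-Euler integration of the latent trajectory under static inputs u."""
    z = z0.copy()
    traj = [z.copy()]
    for _ in range(steps):
        z = z + dt * latent_ode(z, u)
        traj.append(z.copy())
    return np.stack(traj)

traj = rollout(np.zeros(latent_dim), u=np.array([1.0, -0.5]))
print(traj[-1])  # first two latents respond only to u[0], last two only to u[1]
```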
This list is automatically generated from the titles and abstracts of the papers on this site.