Equivariant Deep Dynamical Model for Motion Prediction
- URL: http://arxiv.org/abs/2111.01892v1
- Date: Tue, 2 Nov 2021 21:01:43 GMT
- Title: Equivariant Deep Dynamical Model for Motion Prediction
- Authors: Bahar Azari and Deniz Erdoğmuş
- Abstract summary: Deep generative modeling is a powerful approach to dynamical modeling: it discovers a simplified, compressed underlying description of the data.
Most learning tasks have intrinsic symmetries, i.e., transformations of the input either leave the output unchanged or transform it in a corresponding way.
We propose an SO(3) equivariant deep dynamical model (EqDDM) for motion prediction that learns a structured representation of the input space in the sense that the embedding varies with symmetry transformations.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning representations through deep generative modeling is a powerful
approach to dynamical modeling: it discovers a simplified, compressed underlying
description of the data that can then be used for other tasks such as prediction.
Most learning tasks have intrinsic symmetries, i.e., transformations of the input
either leave the output unchanged or transform it in a corresponding way. The
learning process, however, is usually uninformed of these symmetries, so the
representations learned for individually transformed inputs may not be
meaningfully related. In this paper, we propose an SO(3)-equivariant deep
dynamical model (EqDDM) for motion prediction that learns a structured
representation of the input space, in the sense that the embedding transforms
along with symmetry transformations of the input. EqDDM is equipped with
equivariant networks that parameterize the state-space emission and transition
models. We demonstrate the superior predictive performance of the proposed
model on various motion data.
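The listing does not include code, but the abstract's central idea (a state-space model whose transition and emission maps commute with rotations) can be illustrated with a short, hypothetical sketch. The construction below uses a generic gated vector-feature design: channels of 3D vectors are mixed linearly and gated by rotation-invariant norms, so rotating the input rotates the latent state and the prediction identically. All names (`EquivariantVectorLinear`, `EqDDMCellSketch`) and channel sizes are illustrative assumptions, not the paper's actual parameterization.

```python
# Hypothetical sketch (not the authors' code): an SO(3)-equivariant
# state-space cell in PyTorch. Latent states are sets of 3D vectors of
# shape (batch, channels, 3). Channels are mixed linearly and gated by
# rotation-invariant norms, so rotating the input rotates the output.
import torch
import torch.nn as nn


class EquivariantVectorLinear(nn.Module):
    """Mix vector channels with a learned matrix; commutes with rotations."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_channels, in_channels) / in_channels**0.5)

    def forward(self, v: torch.Tensor) -> torch.Tensor:  # v: (batch, in, 3)
        return torch.einsum("oi,bid->bod", self.weight, v)


class GatedEquivariantBlock(nn.Module):
    """Scale each output vector by an invariant gate computed from norms."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.mix = EquivariantVectorLinear(in_channels, out_channels)
        self.gate = nn.Sequential(nn.Linear(in_channels, out_channels), nn.Sigmoid())

    def forward(self, v: torch.Tensor) -> torch.Tensor:
        norms = v.norm(dim=-1)                 # (batch, in), rotation-invariant
        return self.mix(v) * self.gate(norms).unsqueeze(-1)


class EqDDMCellSketch(nn.Module):
    """One step of an equivariant transition + emission (names illustrative)."""

    def __init__(self, state_channels: int, obs_channels: int):
        super().__init__()
        self.transition = GatedEquivariantBlock(state_channels, state_channels)
        self.emission = GatedEquivariantBlock(state_channels, obs_channels)

    def forward(self, z: torch.Tensor):
        z_next = z + self.transition(z)        # residual equivariant transition
        x_pred = self.emission(z_next)         # equivariant emission
        return z_next, x_pred


if __name__ == "__main__":
    cell = EqDDMCellSketch(state_channels=8, obs_channels=4)
    z = torch.randn(2, 8, 3)
    q, _ = torch.linalg.qr(torch.randn(3, 3))  # random orthogonal matrix
    if torch.det(q) < 0:                       # force det = +1 (a rotation)
        q = -q
    z1, x1 = cell(z)
    z2, x2 = cell(z @ q.T)                     # rotate every input vector
    print(torch.allclose(z1 @ q.T, z2, atol=1e-5),
          torch.allclose(x1 @ q.T, x2, atol=1e-5))
```

Running the script should print `True True`: rotating the input state by a random rotation rotates both the next state and the emitted prediction in the same way, which is the equivariance property the abstract refers to.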
Related papers
- Unsupervised Representation Learning from Sparse Transformation Analysis [79.94858534887801]
We propose to learn representations from sequence data by factorizing the transformations of the latent variables into sparse components.
Input data are first encoded as distributions of latent activations and subsequently transformed using a probability flow model.
arXiv Detail & Related papers (2024-10-07T23:53:25Z)
- Latent State Models of Training Dynamics [51.88132043461152]
We train models with different random seeds and compute a variety of metrics throughout training.
We then fit a hidden Markov model (HMM) over the resulting sequences of metrics.
We use the HMM representation to study phase transitions and identify latent "detour" states that slow down convergence.
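(A minimal sketch of this HMM-fitting procedure appears after this list.)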
arXiv Detail & Related papers (2023-08-18T13:20:08Z)
- EqMotion: Equivariant Multi-agent Motion Prediction with Invariant Interaction Reasoning [83.11657818251447]
We propose EqMotion, an efficient equivariant motion prediction model with invariant interaction reasoning.
We conduct experiments for the proposed model on four distinct scenarios: particle dynamics, molecule dynamics, human skeleton motion prediction and pedestrian trajectory prediction.
Our method achieves state-of-the-art prediction performance on all four tasks, improving on prior results by 24.0%, 30.1%, 8.6%, and 9.2%, respectively.
arXiv Detail & Related papers (2023-03-20T05:23:46Z)
- Learning Symmetric Embeddings for Equivariant World Models [9.781637768189158]
We propose learning symmetric embedding networks (SENs) that encode an input space (e.g. images) into a feature space that transforms in a known way under the symmetry group.
This network can be trained end-to-end with an equivariant task network to learn an explicitly symmetric representation.
Our experiments demonstrate that SENs facilitate the application of equivariant networks to data with complex symmetry representations.
arXiv Detail & Related papers (2022-04-24T22:31:52Z)
- Dimensionless machine learning: Imposing exact units equivariance [7.9926585627926166]
We provide a two-stage learning procedure for units-equivariant machine learning.
We first construct a dimensionless version of the inputs using classic results from dimensional analysis.
We then perform inference in the dimensionless space.
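(A minimal sketch of this two-stage procedure appears after this list.)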
arXiv Detail & Related papers (2022-04-02T15:46:20Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Topographic VAEs learn Equivariant Capsules [84.33745072274942]
We introduce the Topographic VAE: a novel method for efficiently training deep generative models with topographically organized latent variables.
We show that such a model indeed learns to organize its activations according to salient characteristics such as digit class, width, and style on MNIST.
We demonstrate approximate equivariance to complex transformations, expanding upon the capabilities of existing group equivariant neural networks.
arXiv Detail & Related papers (2021-09-03T09:25:57Z)
- Learning Equivariant Energy Based Models with Equivariant Stein Variational Gradient Descent [80.73580820014242]
We focus on the problem of efficient sampling and learning of probability densities by incorporating symmetries in probabilistic models.
We first introduce the Equivariant Stein Variational Gradient Descent algorithm -- an equivariant sampling method based on Stein's identity for sampling from densities with symmetries.
We propose new ways of improving and scaling up training of energy based models.
arXiv Detail & Related papers (2021-06-15T01:35:17Z)
- Incorporating Symmetry into Deep Dynamics Models for Improved Generalization [24.363954435050264]
We propose to improve accuracy and generalization by incorporating symmetries into convolutional neural networks.
Our models are theoretically and experimentally robust to distributional shift by symmetry group transformations.
Compared with image or text applications, our work is a significant step towards applying equivariant neural networks to high-dimensional systems.
arXiv Detail & Related papers (2020-02-08T01:28:17Z)
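Two of the entries above describe procedures concrete enough to sketch. First, for "Latent State Models of Training Dynamics": collect per-seed sequences of training metrics, fit a hidden Markov model over them, and decode latent training phases. The sketch below uses `hmmlearn` on synthetic stand-in metrics; the number of states, the choice of metrics, and all names are illustrative assumptions, not taken from that paper.

```python
# Hypothetical sketch: fit a Gaussian HMM over per-run training-metric
# trajectories (see "Latent State Models of Training Dynamics" above).
# The metric values below are synthetic placeholders.
import numpy as np
from hmmlearn.hmm import GaussianHMM

n_runs, n_steps, n_metrics = 5, 200, 3       # e.g. loss, grad norm, weight norm
rng = np.random.default_rng(0)

# Stand-in for metrics logged throughout training for several random seeds.
runs = [rng.normal(size=(n_steps, n_metrics)).cumsum(axis=0) for _ in range(n_runs)]

# hmmlearn expects all sequences concatenated, plus their individual lengths.
X = np.concatenate(runs, axis=0)
lengths = [len(r) for r in runs]

hmm = GaussianHMM(n_components=4, covariance_type="full", n_iter=100, random_state=0)
hmm.fit(X, lengths)

# Decode each run into a sequence of latent training "phases"; visiting an
# extra state before convergence would correspond to a "detour".
states = [hmm.predict(r) for r in runs]
print(states[0][:20])
```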
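Second, for "Dimensionless machine learning": the two-stage procedure (build dimensionless inputs via dimensional analysis, then fit any model on them) can be sketched as below. The dimension matrix, features, and units are illustrative assumptions; the paper's own construction may differ.

```python
# Hypothetical sketch: stage 1 of a units-equivariant pipeline. Dimensionless
# (pi) groups are exponent vectors in the null space of the dimension matrix;
# any model fit on them is invariant to changes of units.
import numpy as np

# Columns = features (velocity, length, time); rows = base units (metre, second).
# Entry (u, f) is the exponent of unit u carried by feature f.
D = np.array([[1.0, 1.0, 0.0],    # metres:  v ~ m s^-1, l ~ m, t ~ s
              [-1.0, 0.0, 1.0]])  # seconds

# Null-space basis of D gives the exponents of the dimensionless groups.
_, _, vt = np.linalg.svd(D)
null = vt[np.linalg.matrix_rank(D):]          # shape: (n_groups, n_features)

def to_dimensionless(x):
    """Map raw (positive) features to dimensionless pi-group values."""
    return np.exp(np.log(x) @ null.T)

# Stage 2 would fit an ordinary regressor on these pi groups.
x = np.array([[3.0, 2.0, 4.0]])               # velocity, length, time
print(to_dimensionless(x))                    # a power of v*t/l, up to basis normalization
```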