SyMetric: Measuring the Quality of Learnt Hamiltonian Dynamics Inferred from Vision
- URL: http://arxiv.org/abs/2111.05986v1
- Date: Wed, 10 Nov 2021 23:26:58 GMT
- Title: SyMetric: Measuring the Quality of Learnt Hamiltonian Dynamics Inferred from Vision
- Authors: Irina Higgins, Peter Wirnsberger, Andrew Jaegle, Aleksandar Botev
- Abstract summary: A recently proposed class of models attempts to learn latent dynamics from high-dimensional observations.
Existing methods rely on image reconstruction quality, which does not always reflect the quality of the learnt latent dynamics.
We develop a set of new measures, including a binary indicator of whether the underlying Hamiltonian dynamics have been faithfully captured.
- Score: 73.26414295633846
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A recently proposed class of models attempts to learn latent dynamics from
high-dimensional observations, like images, using priors informed by
Hamiltonian mechanics. While these models have important potential applications
in areas like robotics or autonomous driving, there is currently no good way to
evaluate their performance: existing methods primarily rely on image
reconstruction quality, which does not always reflect the quality of the learnt
latent dynamics. In this work, we empirically highlight the problems with the
existing measures and develop a set of new measures, including a binary
indicator of whether the underlying Hamiltonian dynamics have been faithfully
captured, which we call Symplecticity Metric or SyMetric. Our measures take
advantage of the known properties of Hamiltonian dynamics and are more
discriminative of the model's ability to capture the underlying dynamics than
reconstruction error. Using SyMetric, we identify a set of architectural
choices that significantly improve the performance of a previously proposed
model for inferring latent dynamics from pixels, the Hamiltonian Generative
Network (HGN). Unlike the original HGN, the new HGN++ is able to discover an
interpretable phase space with physically meaningful latents on some datasets.
Furthermore, it is stable for significantly longer rollouts on a diverse range
of 13 datasets, producing rollouts of essentially infinite length both forward
and backwards in time with no degradation in quality on a subset of the
datasets.
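The metric itself is defined in the paper; as a rough illustration of the Hamiltonian property it builds on, a latent transition map is symplectic exactly when its Jacobian J preserves the canonical form, i.e. J^T Omega J = Omega. A minimal sketch of such a check (NumPy; `flow_map` stands in for a hypothetical learned one-step latent transition, not the paper's actual pipeline):

```python
import numpy as np

def symplectic_form(dim):
    """Canonical symplectic matrix Omega for a 2*dim-dimensional phase space."""
    I = np.eye(dim)
    Z = np.zeros((dim, dim))
    return np.block([[Z, I], [-I, Z]])

def numerical_jacobian(f, z, eps=1e-5):
    """Central finite-difference Jacobian of f at z."""
    n = z.size
    J = np.zeros((n, n))
    for i in range(n):
        dz = np.zeros(n)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f(z - dz)) / (2 * eps)
    return J

def symplecticity_residual(flow_map, z):
    """|| J^T Omega J - Omega ||; zero iff the map is symplectic at z."""
    J = numerical_jacobian(flow_map, z)
    Omega = symplectic_form(z.size // 2)
    return np.linalg.norm(J.T @ Omega @ J - Omega)

# Example: the exact flow of a harmonic oscillator (a phase-space rotation)
# is symplectic, so the residual is ~0 up to finite-difference error.
theta = 0.1
R = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
print(symplecticity_residual(lambda z: R @ z, np.array([1.0, 0.5])))
```

A residual near zero at sampled latent states is consistent with Hamiltonian dynamics, whereas a map that fails this check cannot be the flow of any Hamiltonian.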
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
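The paper's sampler and learned energy are its own; as a generic sketch of MCMC sampling from an energy-based latent prior p(z) proportional to exp(-E(z)), here is unadjusted Langevin dynamics with a toy quadratic energy standing in for the learned one:

```python
import numpy as np

def langevin_sample(grad_E, z0, step=1e-2, n_steps=200, rng=None):
    """Unadjusted Langevin dynamics: z <- z - step*grad_E(z) + sqrt(2*step)*noise."""
    rng = rng or np.random.default_rng(0)
    z = z0.copy()
    for _ in range(n_steps):
        z = z - step * grad_E(z) + np.sqrt(2 * step) * rng.standard_normal(z.shape)
    return z

# Toy energy E(z) = 0.5 * ||z||^2 (a standard Gaussian prior); a learned
# energy network would replace this gradient in practice.
grad_E = lambda z: z
print(langevin_sample(grad_E, np.zeros(4)))
```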
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Course Correcting Koopman Representations [12.517740162118855]
We study autoencoder formulations of learning Koopman representations and different ways they can be used to model dynamics.
We propose an inference-time mechanism, which we refer to as Periodic Reencoding, for faithfully capturing long-term dynamics.
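A minimal sketch of the idea (the paper's exact procedure may differ; `encode`, `decode` and `latent_step` are hypothetical stand-ins for trained model components):

```python
import numpy as np

def rollout_with_periodic_reencoding(x0, encode, decode, latent_step,
                                     n_steps, reencode_every=10):
    """Roll dynamics out in latent space, re-encoding every few steps.

    Decoding to observation space and re-encoding projects the latent state
    back onto the encoder's manifold, limiting accumulated drift.
    """
    z = encode(x0)
    trajectory = []
    for t in range(1, n_steps + 1):
        z = latent_step(z)
        if t % reencode_every == 0:
            z = encode(decode(z))  # the course correction
        trajectory.append(decode(z))
    return trajectory

# Toy check with linear stand-ins for the model components.
traj = rollout_with_periodic_reencoding(
    np.ones(2), encode=lambda x: x, decode=lambda z: z,
    latent_step=lambda z: 0.99 * z, n_steps=5)
print(traj)
```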
arXiv Detail & Related papers (2023-10-23T22:36:31Z) - Understanding Self-attention Mechanism via Dynamical System Perspective [58.024376086269015]
Self-attention mechanism (SAM) is widely used in various fields of artificial intelligence.
We show that the intrinsic stiffness phenomenon (SP) found in high-precision solutions of ordinary differential equations (ODEs) also exists widely in high-performance neural networks (NNs).
We show that SAM is also a stiffness-aware step-size adaptor that can enhance the model's representational ability to measure intrinsic SP.
arXiv Detail & Related papers (2023-08-19T08:17:41Z) - Spatio-Temporal Branching for Motion Prediction using Motion Increments [55.68088298632865]
Human motion prediction (HMP) has emerged as a popular research topic due to its diverse applications.
Traditional methods rely on hand-crafted features and machine learning techniques.
We propose a novel spatio-temporal branching network using incremental information for HMP.
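The branching architecture itself is more involved; the incremental ingredient alone amounts to predicting per-step displacements rather than absolute poses. A toy sketch with a hypothetical `predict_increment` standing in for the network:

```python
import numpy as np

def predict_poses(history, predict_increment, n_future):
    """Autoregressive pose prediction from increments (deltas)."""
    poses = list(history)
    for _ in range(n_future):
        delta = predict_increment(np.stack(poses[-2:]))  # model sees recent context
        poses.append(poses[-1] + delta)                  # integrate the increment
    return poses[len(history):]

# Toy stand-in for the network: extrapolate the last observed velocity.
last_velocity = lambda ctx: ctx[-1] - ctx[-2]
print(predict_poses([np.zeros(3), np.ones(3)], last_velocity, n_future=3))
```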
arXiv Detail & Related papers (2023-08-02T12:04:28Z) - Anamnesic Neural Differential Equations with Orthogonal Polynomial
Projections [6.345523830122166]
We propose PolyODE, a formulation that enforces long-range memory and preserves a global representation of the underlying dynamical system.
Our construction is backed by favourable theoretical guarantees and we demonstrate that it outperforms previous works in the reconstruction of past and future data.
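As a rough, HiPPO-style illustration of the orthogonal-polynomial idea (not the paper's exact construction), projecting an observed trajectory onto a Legendre basis yields a fixed-size global summary from which the past can be reconstructed:

```python
import numpy as np

# Sample a 1-D trajectory on [-1, 1] and project onto Legendre polynomials.
t = np.linspace(-1, 1, 200)
x = np.sin(3 * t) + 0.5 * t                           # toy trajectory
coeffs = np.polynomial.legendre.legfit(t, x, deg=8)   # fixed-size global memory
x_rec = np.polynomial.legendre.legval(t, coeffs)      # reconstruct the past

print("reconstruction error:", np.max(np.abs(x - x_rec)))  # small for smooth x
```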
arXiv Detail & Related papers (2023-03-03T10:49:09Z) - ConCerNet: A Contrastive Learning Based Framework for Automated
Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
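The contrastive objective is defined in the paper; the conservation metric can be illustrated as the drift of a scalar quantity C(x) along a trajectory. A minimal sketch with a hand-written conserved quantity standing in for a learned one:

```python
import numpy as np

def conservation_error(trajectory, conserved_fn):
    """Relative drift of a scalar quantity along a trajectory (0 = conserved)."""
    values = np.array([conserved_fn(x) for x in trajectory])
    return np.std(values) / (np.abs(np.mean(values)) + 1e-12)

# Example: pendulum energy of a state x = (q, p), approximately constant
# along this small-amplitude trajectory.
energy = lambda x: 0.5 * x[1] ** 2 + (1 - np.cos(x[0]))
traj = [np.array([0.1 * np.cos(w), -0.1 * np.sin(w)])
        for w in np.linspace(0, 2, 50)]
print(conservation_error(traj, energy))
```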
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - Deep learning and differential equations for modeling changes in
individual-level latent dynamics between observation periods [0.0]
We propose an extension where different sets of differential equation parameters are allowed for observation sub-periods.
We derive prediction targets from individual dynamic models of resilience in the application.
Our approach successfully identifies individual-level parameters of dynamic models, allowing us to stably select predictors.
arXiv Detail & Related papers (2022-02-15T13:53:42Z) - Which priors matter? Benchmarking models for learning latent dynamics [70.88999063639146]
Several methods have been proposed to integrate priors from classical mechanics into machine learning models.
We take a sober look at the current capabilities of these models.
We find that the use of continuous and time-reversible dynamics benefits models of all classes.
arXiv Detail & Related papers (2021-11-09T23:48:21Z) - A unified framework for Hamiltonian deep neural networks [3.0934684265555052]
Training deep neural networks (DNNs) can be difficult due to vanishing/exploding gradients during weight optimization.
We propose a class of DNNs stemming from the time discretization of Hamiltonian systems.
The proposed Hamiltonian framework, besides encompassing existing networks inspired by marginally stable ODEs, allows one to derive new and more expressive architectures.
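One concrete instance of this idea, sketched under the assumption of a symplectic-Euler discretization with log-cosh potentials (not the paper's exact parameterization):

```python
import numpy as np

def hamiltonian_layer(q, p, Kq, Kp, h=0.1):
    """One symplectic-Euler step of H(q, p) = sum(logcosh(Kq q)) + sum(logcosh(Kp p)),
    used as a network layer. Flows of this kind are marginally stable, which
    helps gradients propagate through deep stacks.
    """
    p = p - h * Kq.T @ np.tanh(Kq @ q)  # p <- p - h * dH/dq
    q = q + h * Kp.T @ np.tanh(Kp @ p)  # q <- q + h * dH/dp (using updated p)
    return q, p

rng = np.random.default_rng(0)
q, p = rng.standard_normal(4), rng.standard_normal(4)
for _ in range(8):  # stacking layers = integrating the discretized system
    q, p = hamiltonian_layer(q, p,
                             rng.standard_normal((4, 4)),
                             rng.standard_normal((4, 4)))
print(q, p)
```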
arXiv Detail & Related papers (2021-04-27T13:20:24Z) - Heteroscedastic Uncertainty for Robust Generative Latent Dynamics [7.107159120605662]
We present a method to jointly learn a latent state representation and the associated dynamics.
As our main contribution, we describe how our representation is able to capture a notion of heteroscedastic or input-specific uncertainty.
We present results from prediction and control experiments on two image-based tasks.
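The full model is a latent dynamics model; the heteroscedastic ingredient itself is commonly implemented as a head that outputs an input-dependent log-variance, trained with the Gaussian negative log-likelihood. A minimal sketch of that loss (generic recipe, not the paper's exact objective):

```python
import numpy as np

def heteroscedastic_nll(y, mean, log_var):
    """Gaussian NLL with input-specific (heteroscedastic) variance.

    Predicting log-variance keeps the variance positive and lets the model
    down-weight inputs it is uncertain about.
    """
    return 0.5 * np.mean(log_var + (y - mean) ** 2 / np.exp(log_var))

# A confident prediction (low log-variance) is penalized more for the same
# error than an uncertain one.
y = np.array([1.0])
print(heteroscedastic_nll(y, np.array([0.0]), np.array([-2.0])))  # large loss
print(heteroscedastic_nll(y, np.array([0.0]), np.array([2.0])))   # small loss
```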
arXiv Detail & Related papers (2020-08-18T21:04:33Z)