Hamiltonian GAN
- URL: http://arxiv.org/abs/2308.11216v1
- Date: Tue, 22 Aug 2023 06:03:00 GMT
- Title: Hamiltonian GAN
- Authors: Christine Allen-Blanchette
- Abstract summary: We present a GAN-based video generation pipeline with a learned configuration space map and Hamiltonian neural network motion model.
We train our model with a physics-inspired cyclic-coordinate loss function which encourages a minimal representation of the configuration space and improves interpretability.
- Score: 1.6589012298747952
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A growing body of work leverages the Hamiltonian formalism as an inductive
bias for physically plausible neural network based video generation. The
structure of the Hamiltonian ensures conservation of a learned quantity (e.g.,
energy) and imposes a phase-space interpretation on the low-dimensional
manifold underlying the input video. While this interpretation has the
potential to facilitate the integration of learned representations in
downstream tasks, existing methods are limited in their applicability as they
require a structural prior for the configuration space at design time. In this
work, we present a GAN-based video generation pipeline with a learned
configuration space map and Hamiltonian neural network motion model, to learn a
representation of the configuration space from data. We train our model with a
physics-inspired cyclic-coordinate loss function which encourages a minimal
representation of the configuration space and improves interpretability. We
demonstrate the efficacy and advantages of our approach on the Hamiltonian
Dynamics Suite Toy Physics dataset.
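The core idea behind the Hamiltonian neural network motion model described above can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's actual architecture): a small network parameterizes a scalar H(q, p), and trajectories follow Hamilton's equations, which conserve the learned quantity by construction.

```python
import numpy as np

# Hypothetical minimal Hamiltonian neural network motion model.
# A scalar network H(q, p) is learned; dynamics follow Hamilton's equations:
#   dq/dt =  dH/dp,   dp/dt = -dH/dq
# so the learned quantity H (e.g., energy) is conserved along trajectories.

def hamiltonian(z, W1, b1, W2):
    """Tiny MLP mapping phase-space coordinates z = (q, p) to a scalar H."""
    h = np.tanh(z @ W1 + b1)
    return float(h @ W2)

def hamiltonian_vector_field(z, W1, b1, W2, eps=1e-5):
    """Finite-difference gradient of H, rotated by the symplectic form."""
    grad = np.zeros_like(z)
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        grad[i] = (hamiltonian(z + dz, W1, b1, W2)
                   - hamiltonian(z - dz, W1, b1, W2)) / (2 * eps)
    # For a 1-DOF system z = (q, p): dq/dt = dH/dp, dp/dt = -dH/dq
    return np.array([grad[1], -grad[0]])

# Random (untrained) weights, just to make the sketch runnable.
rng = np.random.default_rng(0)
W1, b1, W2 = rng.normal(size=(2, 16)), rng.normal(size=16), rng.normal(size=16)

# One small explicit-Euler step; H drifts only slightly because the
# continuous-time flow conserves H exactly.
z = np.array([1.0, 0.0])
z_next = z + 0.01 * hamiltonian_vector_field(z, W1, b1, W2)
print(z_next.shape)
```

In practice the gradient of H would come from automatic differentiation and the weights from training, but the structural point is the same: conservation is built into the vector field rather than learned from data.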
Related papers
- Learning interactions between Rydberg atoms [4.17037025217542]
We introduce a scalable approach to Hamiltonian learning using graph neural networks (GNNs)
We demonstrate that our GNN model has a remarkable capacity to extrapolate beyond its training domain.
arXiv Detail & Related papers (2024-12-16T17:45:30Z)
- Hamiltonian Mechanics of Feature Learning: Bottleneck Structure in Leaky ResNets [58.460298576330835]
We study Leaky ResNets, which interpolate between ResNets and Fully-Connected nets depending on an 'effective depth'.
We leverage this intuition to explain the emergence of a bottleneck structure, as observed in previous work.
arXiv Detail & Related papers (2024-05-27T18:15:05Z)
- Injecting Hamiltonian Architectural Bias into Deep Graph Networks for Long-Range Propagation [55.227976642410766]
The dynamics of information diffusion within graphs are a critical open issue that heavily influences graph representation learning.
Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks.
We reconcile under a single theoretical and practical framework both non-dissipative long-range propagation and non-conservative behaviors.
arXiv Detail & Related papers (2024-05-27T13:36:50Z)
- Applications of Machine Learning to Modelling and Analysing Dynamical Systems [0.0]
We propose an architecture which combines existing Hamiltonian Neural Network structures into Adaptable Symplectic Recurrent Neural Networks.
This architecture is found to significantly outperform previously proposed neural networks when predicting Hamiltonian dynamics.
We show that this method works efficiently for single parameter potentials and provides accurate predictions even over long periods of time.
arXiv Detail & Related papers (2023-07-22T19:04:17Z)
- Lagrangian Density Space-Time Deep Neural Network Topology [0.0]
We have proposed a "Lagrangian Density Space-Time Deep Neural Networks" (LDDNN) topology.
It supports unsupervised training and learns to predict the dynamics of phenomena governed by underlying physics.
This article will discuss statistical physics interpretation of neural networks in the Lagrangian and Hamiltonian domains.
arXiv Detail & Related papers (2022-06-30T03:29:35Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates, noisy and irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
arXiv Detail & Related papers (2022-02-28T22:54:39Z)
- Dissipative Hamiltonian Neural Networks: Learning Dissipative and Conservative Dynamics Separately [1.52292571922932]
Recent work has shown that neural networks can learn such symmetries directly from data using Hamiltonian Neural Networks (HNNs).
In this paper, we ask whether it is possible to identify and decompose conservative and dissipative dynamics simultaneously.
We propose Dissipative Hamiltonian Neural Networks (D-HNNs), which parameterize both a Hamiltonian and a Rayleigh dissipation function. Taken together, they represent an implicit Helmholtz decomposition which can separate dissipative effects such as friction from symmetries such as conservation of energy.
arXiv Detail & Related papers (2022-01-25T04:09:11Z)
- Hamiltonian prior to Disentangle Content and Motion in Image Sequences [2.2133187119466116]
We present a deep latent variable model for high dimensional sequential data.
We split the motion space into subspaces, and introduce a unique Hamiltonian operator for each subspace.
The explicit split of the motion space decomposes the Hamiltonian into symmetry groups and gives long-term separability.
arXiv Detail & Related papers (2021-12-02T23:41:12Z)
- SyMetric: Measuring the Quality of Learnt Hamiltonian Dynamics Inferred from Vision [73.26414295633846]
A recently proposed class of models attempts to learn latent dynamics from high-dimensional observations.
Existing methods rely on image reconstruction quality, which does not always reflect the quality of the learnt latent dynamics.
We develop a set of new measures, including a binary indicator of whether the underlying Hamiltonian dynamics have been faithfully captured.
arXiv Detail & Related papers (2021-11-10T23:26:58Z)
- Machine Learning S-Wave Scattering Phase Shifts Bypassing the Radial Schr\"odinger Equation [77.34726150561087]
We present a proof-of-concept machine learning model, resting on a convolutional neural network, capable of yielding accurate scattering s-wave phase shifts.
We discuss how the Hamiltonian can serve as a guiding principle in the construction of a physically-motivated descriptor.
arXiv Detail & Related papers (2021-06-25T17:25:38Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
- Symplectic ODE-Net: Learning Hamiltonian Dynamics with Control [14.24939133094439]
We introduce Symplectic ODE-Net (SymODEN), a deep learning framework which can infer the dynamics of a physical system.
In particular, we enforce Hamiltonian dynamics with control to learn the underlying dynamics in a transparent way.
This framework, by offering interpretable, physically-consistent models for physical systems, opens up new possibilities for synthesizing model-based control strategies.
arXiv Detail & Related papers (2019-09-26T13:13:16Z)
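Several of the papers listed above pair Hamiltonian models with structure-preserving integration schemes. The following sketch, for a known (not learned) pendulum Hamiltonian H(q, p) = p^2/2 - cos(q), illustrates the underlying point: a leapfrog (Störmer-Verlet) integrator keeps the energy error bounded over long rollouts, which is why symplectic integration matters for long-term Hamiltonian prediction. The step size and step count are arbitrary choices for the demonstration.

```python
import math

def leapfrog(q, p, dt, steps):
    """Stormer-Verlet steps for H(q, p) = p^2/2 - cos(q) (ideal pendulum)."""
    for _ in range(steps):
        p -= 0.5 * dt * math.sin(q)   # half kick: dp/dt = -dH/dq = -sin(q)
        q += dt * p                   # drift:     dq/dt =  dH/dp = p
        p -= 0.5 * dt * math.sin(q)   # half kick
    return q, p

def energy(q, p):
    return 0.5 * p * p - math.cos(q)

q0, p0 = 1.0, 0.0
qT, pT = leapfrog(q0, p0, dt=0.1, steps=1000)
drift = abs(energy(qT, pT) - energy(q0, p0))
print(f"energy drift after 1000 steps: {drift:.2e}")
```

With an explicit Euler step of the same size, the energy error would instead grow steadily with the number of steps.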
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.