Simplifying Hamiltonian and Lagrangian Neural Networks via Explicit
Constraints
- URL: http://arxiv.org/abs/2010.13581v1
- Date: Mon, 26 Oct 2020 13:35:16 GMT
- Title: Simplifying Hamiltonian and Lagrangian Neural Networks via Explicit
Constraints
- Authors: Marc Finzi, Ke Alexander Wang, Andrew Gordon Wilson
- Abstract summary: We introduce a series of challenging chaotic and extended-body systems to push the limits of current approaches.
Our experiments show that Cartesian coordinates with explicit constraints lead to a 100x improvement in accuracy and data efficiency.
- Score: 49.66841118264278
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reasoning about the physical world requires models that are endowed with the
right inductive biases to learn the underlying dynamics. Recent works improve
generalization for predicting trajectories by learning the Hamiltonian or
Lagrangian of a system rather than the differential equations directly. While
these methods encode the constraints of the systems using generalized
coordinates, we show that embedding the system into Cartesian coordinates and
enforcing the constraints explicitly with Lagrange multipliers dramatically
simplifies the learning problem. We introduce a series of challenging chaotic
and extended-body systems, including systems with N-pendulums, spring coupling,
magnetic fields, rigid rotors, and gyroscopes, to push the limits of current
approaches. Our experiments show that Cartesian coordinates with explicit
constraints lead to a 100x improvement in accuracy and data efficiency.
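For intuition, here is a minimal JAX sketch of this kind of constrained update: the state is the Cartesian phase-space vector z = (x, p), a (possibly learned) Hamiltonian H supplies the unconstrained vector field, and holonomic constraints Phi(z) = 0 are enforced by solving for Lagrange multipliers that keep the flow on the constraint surface. The pendulum Hamiltonian and constraint at the end are illustrative placeholders, not the authors' code.

```python
import jax
import jax.numpy as jnp

def symplectic_J(d):
    """Canonical symplectic matrix for a 2d-dimensional phase space z = (x, p)."""
    I, Z = jnp.eye(d), jnp.zeros((d, d))
    return jnp.block([[Z, I], [-I, Z]])

def constrained_dynamics(H, Phi, z):
    """Time derivative of z under Hamiltonian H with explicit constraints Phi(z) = 0.

    The unconstrained flow J @ grad H is corrected by a constraint force
    J @ DPhi^T @ lam, with the multipliers lam chosen so that Phi stays on its
    zero level set, i.e. d/dt Phi(z(t)) = DPhi @ zdot = 0.
    """
    J = symplectic_J(z.shape[0] // 2)
    gradH = jax.grad(H)(z)           # dH/dz
    DPhi = jax.jacobian(Phi)(z)      # shape (num_constraints, dim z)
    lam = jnp.linalg.solve(DPhi @ J @ DPhi.T, -DPhi @ J @ gradH)
    return J @ (gradH + DPhi.T @ lam)

# Illustrative example (an assumption, not the paper's code): a planar pendulum
# of mass m and rod length r, written in Cartesian coordinates.
m, r, g = 1.0, 1.0, 9.81

def H(z):
    x, p = z[:2], z[2:]
    return jnp.dot(p, p) / (2 * m) + m * g * x[1]        # kinetic + potential

def Phi(z):
    x, p = z[:2], z[2:]
    # Fixed rod length, plus the induced momentum-level constraint x . xdot = 0.
    return jnp.stack([jnp.dot(x, x) - r**2, jnp.dot(x, p) / m])

z0 = jnp.array([r, 0.0, 0.0, 0.0])           # hanging horizontally, at rest
print(constrained_dynamics(H, Phi, z0))      # one evaluation of the vector field
```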
Related papers
- A Bayesian framework for discovering interpretable Lagrangian of
dynamical systems from data [1.0878040851638]
We propose an alternate framework for learning interpretable Lagrangian descriptions of physical systems.
Unlike existing neural network-based approaches, the proposed approach yields an interpretable description of the Lagrangian.
arXiv Detail & Related papers (2023-10-10T01:35:54Z)
- Deep Learning for Structure-Preserving Universal Stable Koopman-Inspired
Embeddings for Nonlinear Canonical Hamiltonian Dynamics [9.599029891108229]
We focus on the identification of global linearized embeddings for canonical nonlinear Hamiltonian systems through a symplectic transformation.
To overcome the shortcomings of Koopman operators for systems with continuous spectra, we apply the lifting principle and learn global cubicized embeddings.
We demonstrate the capabilities of deep learning in acquiring compact symplectic coordinate transformations and the corresponding simple dynamical models.
arXiv Detail & Related papers (2023-08-26T09:58:09Z)
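As a rough illustration of the Koopman-inspired, structure-preserving embedding entry above: if the latent dynamics are restricted to a quadratic Hamiltonian H(z) = 0.5 z^T A z, the latent flow is linear and its exact time step is a symplectic matrix exponential. The JAX sketch below uses a placeholder linear encoder/decoder and a one-step prediction loss as simplifying assumptions; the paper's architecture and training objective are not reproduced here.

```python
import jax
import jax.numpy as jnp
from jax.scipy.linalg import expm

def symplectic_J(d):
    I, Z = jnp.eye(d), jnp.zeros((d, d))
    return jnp.block([[Z, I], [-I, Z]])

def latent_step(A_raw, z, dt):
    """Advance the latent state by dt under the quadratic Hamiltonian 0.5 z^T A z.

    Symmetrizing A_raw keeps the Hamiltonian a quadratic form; the matrix
    exponential of dt * J @ A is then an exactly symplectic linear map.
    """
    A = 0.5 * (A_raw + A_raw.T)
    return expm(dt * symplectic_J(z.shape[0] // 2) @ A) @ z

# Placeholder linear encoder/decoder (assumptions for this sketch; the paper's
# embeddings are neural networks with a different training objective).
def prediction_loss(params, x_t, x_tp1, dt):
    E, D, A_raw = params
    z_pred = latent_step(A_raw, E @ x_t, dt)
    return jnp.sum((D @ z_pred - x_tp1) ** 2)

# Example shapes (assumptions): 6-D observations, 4-D latent phase space.
key = jax.random.PRNGKey(0)
E = jax.random.normal(key, (4, 6))
params = (E, E.T, jnp.eye(4))
print(prediction_loss(params, jnp.ones(6), jnp.ones(6), dt=0.1))
```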
- Constrained Optimization via Exact Augmented Lagrangian and Randomized
Iterative Sketching [55.28394191394675]
We develop an adaptive inexact Newton method for equality-constrained nonlinear, nonconvex optimization problems.
We demonstrate the superior performance of our method on benchmark nonlinear problems, constrained logistic regression with data from LIBSVM, and a PDE-constrained problem.
arXiv Detail & Related papers (2023-05-28T06:33:37Z)
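For background on the constrained-optimization entry above, the textbook building block is a Newton step on the KKT system of min f(x) subject to c(x) = 0. The sketch below shows only that plain step; the paper's exact augmented Lagrangian merit function and randomized iterative sketching of the linear solves are not reproduced here.

```python
import jax
import jax.numpy as jnp

def newton_kkt_step(f, c, x, lam):
    """One Newton step on the KKT conditions of  min f(x)  s.t.  c(x) = 0.

    Solves the symmetric system
        [ H   A^T ] [dx  ]   [ -(grad f(x) + A^T lam) ]
        [ A   0   ] [dlam] = [ -c(x)                  ]
    where H is the Hessian of the Lagrangian and A is the constraint Jacobian.
    """
    gradL = jax.grad(lambda x_: f(x_) + jnp.dot(lam, c(x_)))
    H = jax.jacobian(gradL)(x)
    A = jax.jacobian(c)(x)
    K = jnp.block([[H, A.T], [A, jnp.zeros((A.shape[0], A.shape[0]))]])
    step = jnp.linalg.solve(K, -jnp.concatenate([gradL(x), c(x)]))
    return x + step[: x.size], lam + step[x.size:]

# Toy usage (illustrative assumption): minimize ||x||^2 subject to x0 + x1 = 1.
f = lambda x: jnp.dot(x, x)
c = lambda x: jnp.stack([x[0] + x[1] - 1.0])
x, lam = newton_kkt_step(f, c, jnp.array([2.0, -1.0]), jnp.zeros(1))
print(x)   # a single step lands on the solution (0.5, 0.5) for this quadratic
```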
- Discovering interpretable Lagrangian of dynamical systems from data [0.0]
Recent trends in representation learning involve learning the Lagrangian from data rather than the direct discovery of governing equations of motion.
We propose a novel data-driven machine-learning algorithm to automate the discovery of interpretable Lagrangian from data.
arXiv Detail & Related papers (2023-02-09T01:57:05Z)
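The Lagrangian-discovery entries above rest on the same mechanical step: once a Lagrangian L(q, qdot) is available, learned or symbolic, accelerations follow from the Euler-Lagrange equations. A minimal autodiff sketch, with an illustrative quadratic Lagrangian standing in for a learned one:

```python
import jax
import jax.numpy as jnp

def accelerations(L, q, qdot):
    """Accelerations implied by a Lagrangian L(q, qdot) via the Euler-Lagrange
    equations:  qddot = M^{-1} (dL/dq - C @ qdot), with M = d^2L/dqdot^2 and
    C[i, j] = d^2L / (dqdot_i dq_j)."""
    dL_dq = jax.grad(L, argnums=0)(q, qdot)
    M = jax.hessian(L, argnums=1)(q, qdot)
    C = jax.jacobian(jax.grad(L, argnums=1), argnums=0)(q, qdot)
    return jnp.linalg.solve(M, dL_dq - C @ qdot)

# Illustrative Lagrangian (an assumption, standing in for a learned model):
# a unit-mass particle in a quadratic potential.
L = lambda q, qdot: 0.5 * jnp.dot(qdot, qdot) - 0.5 * jnp.dot(q, q)
print(accelerations(L, jnp.array([1.0, 0.0]), jnp.array([0.0, 1.0])))  # ~[-1, 0]
```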
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose the Graph Mechanics Network (GMN), which is efficient, equivariant, and constraint-aware.
GMN represents, by generalized coordinates, the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN over state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction, and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Learning Hamiltonians of constrained mechanical systems [0.0]
Hamiltonian systems are an elegant and compact formalism in classical mechanics.
We propose new approaches for the accurate approximation of the Hamiltonian function of constrained mechanical systems.
arXiv Detail & Related papers (2022-01-31T14:03:17Z)
- Deep Learning Approximation of Diffeomorphisms via Linear-Control
Systems [91.3755431537592]
We consider a control system of the form $\dot x = \sum_{i=1}^{l} F_i(x)\, u_i$, with linear dependence in the controls.
We use the corresponding flow to approximate the action of a diffeomorphism on a compact ensemble of points.
arXiv Detail & Related papers (2021-10-24T08:57:46Z)
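To make the flow construction in the linear-control entry above concrete: with piecewise-constant controls, integrating $\dot x = \sum_i u_i F_i(x)$ transports an ensemble of points, and the resulting map approximates the action of a diffeomorphism. The two vector fields and the forward-Euler integrator below are illustrative assumptions, not the paper's construction.

```python
import jax.numpy as jnp

def flow_ensemble(fields, controls, points, dt=0.01, steps_per_piece=10):
    """Push an ensemble of points through the flow of xdot = sum_i u_i F_i(x).

    `controls` has shape (num_pieces, num_fields) and holds piecewise-constant
    control values; forward Euler is used for brevity.
    """
    for u in controls:
        for _ in range(steps_per_piece):
            velocity = sum(u_i * F(points) for u_i, F in zip(u, fields))
            points = points + dt * velocity
    return points

# Two illustrative controlled fields on the plane (assumptions for this sketch).
fields = (
    lambda X: jnp.broadcast_to(jnp.array([1.0, 0.0]), X.shape),  # translation
    lambda X: jnp.stack([-X[:, 1], X[:, 0]], axis=1),            # rotation
)
controls = jnp.array([[0.5, 0.0], [0.0, 1.0]])    # translate, then rotate a bit
ensemble = jnp.array([[0.0, 0.0], [1.0, 1.0]])
print(flow_ensemble(fields, controls, ensemble))
```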
- Fast Gravitational Approach for Rigid Point Set Registration with
Ordinary Differential Equations [79.71184760864507]
This article introduces a new physics-based method for rigid point set alignment called Fast Gravitational Approach (FGA).
In FGA, the source and target point sets are interpreted as rigid particle swarms with masses interacting in a globally multiply-linked manner while moving in a simulated gravitational force field.
We show that the new method class has characteristics not found in previous alignment methods.
arXiv Detail & Related papers (2020-09-28T15:05:39Z)
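A rough 2D sketch of the mechanism described in the FGA entry above (not the authors' implementation): every target point exerts a softened gravity-like pull on every source point, and the summed force and torque about the source centroid are what would drive the rigid-body update.

```python
import jax.numpy as jnp

def net_force_and_torque(source, target, G=1.0, eps=1e-3):
    """Softened gravity-like pull of a target point set on a rigid source set.

    Each target point attracts each source point with a softened inverse-square
    force; the per-point forces are summed into a net force on the rigid body
    and a scalar (2D) torque about the source centroid.
    """
    diff = target[None, :, :] - source[:, None, :]                # (Ns, Nt, 2)
    dist = jnp.sqrt(jnp.sum(diff**2, axis=-1) + eps)              # softened
    per_point = G * jnp.sum(diff / dist[..., None] ** 3, axis=1)  # (Ns, 2)
    arm = source - source.mean(axis=0)
    torque = jnp.sum(arm[:, 0] * per_point[:, 1] - arm[:, 1] * per_point[:, 0])
    return per_point.sum(axis=0), torque

# Toy usage: a small source cloud and a shifted copy as the target.
source = jnp.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
force, torque = net_force_and_torque(source, source + jnp.array([2.0, 0.5]))
# In FGA-style registration, this force and torque would drive a rigid-body
# update of `source` (translation and rotation) inside an ODE integrator.
print(force, torque)
```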
- Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.