Machine learning structure preserving brackets for forecasting irreversible processes
- URL: http://arxiv.org/abs/2106.12619v1
- Date: Wed, 23 Jun 2021 18:27:59 GMT
- Title: Machine learning structure preserving brackets for forecasting irreversible processes
- Authors: Kookjin Lee and Nathaniel A. Trask and Panos Stinis
- Abstract summary: We present a novel parameterization of dissipative brackets from metriplectic dynamical systems.
The process learns generalized Casimirs for energy and entropy that are guaranteed to be conserved and nondecreasing, respectively.
We provide benchmarks for dissipative systems demonstrating that the learned dynamics are more robust and generalize better than either "black-box" or penalty-based approaches.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Forecasting of time-series data requires imposition of inductive biases to
obtain predictive extrapolation, and recent works have imposed
Hamiltonian/Lagrangian form to preserve structure for systems with reversible
dynamics. In this work we present a novel parameterization of dissipative
brackets from metriplectic dynamical systems appropriate for learning
irreversible dynamics with unknown a priori model form. The process learns
generalized Casimirs for energy and entropy guaranteed to be conserved and
nondecreasing, respectively. Furthermore, for the case of added thermal noise,
we guarantee exact preservation of a fluctuation-dissipation theorem, ensuring
thermodynamic consistency. We provide benchmarks for dissipative systems
demonstrating learned dynamics are more robust and generalize better than
either "black-box" or penalty-based approaches.
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
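As a rough illustration of the MCMC component: training an energy-based prior typically draws latent samples with short-run Langevin dynamics. The sketch below (PyTorch; the energy function and step sizes are placeholders, not the authors' settings) shows the generic sampler.

    import torch

    def langevin_sample(energy, z, n_steps=20, step=0.1):
        # short-run Langevin dynamics targeting p(z) proportional to exp(-energy(z))
        for _ in range(n_steps):
            z = z.detach().requires_grad_(True)
            g = torch.autograd.grad(energy(z).sum(), z)[0]
            z = z - 0.5 * step ** 2 * g + step * torch.randn_like(z)
        return z.detach()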
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
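A minimal NumPy sketch of the PGDS generative model being sampled (dimensions and hyperparameters are illustrative; the paper's non-stationary variant additionally lets the transition matrix Pi evolve over time):

    import numpy as np

    rng = np.random.default_rng(0)
    V, K, T = 10, 3, 50                        # observed dim, factors, time steps
    Phi = rng.dirichlet(np.ones(V), size=K).T  # V x K factor loading matrix
    Pi = rng.dirichlet(np.ones(K), size=K).T   # K x K transition matrix (static here)
    tau = 5.0                                  # concentration of the gamma transitions
    theta = rng.gamma(1.0, 1.0, size=K)
    for t in range(T):
        theta = rng.gamma(tau * Pi @ theta, 1.0 / tau)  # E[theta_t] = Pi @ theta_{t-1}
        y_t = rng.poisson(Phi @ theta)                  # Poisson counts at time t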
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- TANGO: Time-Reversal Latent GraphODE for Multi-Agent Dynamical Systems [43.39754726042369]
We propose a simple-yet-effective self-supervised regularization term as a soft constraint that aligns the forward and backward trajectories predicted by a continuous graph neural network-based ordinary differential equation (GraphODE).
It effectively imposes time-reversal symmetry to enable more accurate model predictions across a wider range of dynamical systems under classical mechanics.
Experimental results on a variety of physical systems demonstrate the effectiveness of our proposed method.
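The soft constraint can be sketched as a trajectory-alignment penalty. Below is a simplified version (assuming the torchdiffeq package; the paper's reversal operator for mechanical systems also negates velocities, which is omitted here): roll the ODE forward, roll it back from the final state, and penalize the mismatch.

    import torch
    from torchdiffeq import odeint  # assumed dependency

    def time_reversal_loss(odefunc, x0, ts):
        xf = odeint(odefunc, x0, ts)              # forward rollout over times ts
        xb = odeint(odefunc, xf[-1], ts.flip(0))  # rollout backward from the final state
        return ((xf - xb.flip(0)) ** 2).mean()    # penalize forward/backward mismatch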
arXiv Detail & Related papers (2023-10-10T08:52:16Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of the (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
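For orientation only: the central object is a lattice Green's function with dissipation entering as a non-Hermitian frequency shift. A NumPy sketch for a uniform lossy tight-binding chain (all parameters illustrative; the decimation technique itself eliminates lattice sites recursively rather than inverting directly):

    import numpy as np

    N, J, kappa = 20, 1.0, 0.2                   # sites, hopping, uniform loss rate
    H = -J * (np.eye(N, k=1) + np.eye(N, k=-1))  # tight-binding chain Hamiltonian

    def green(omega):
        # retarded Green's function with single-site loss kappa on every site
        return np.linalg.inv((omega + 0.5j * kappa) * np.eye(N) - H)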
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Structure-Preserving Learning Using Gaussian Processes and Variational Integrators [62.31425348954686]
We propose combining a variational integrator for the nominal dynamics of a mechanical system with Gaussian process regression for the residual dynamics.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
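A toy version of the idea (NumPy/scikit-learn; the nominal step here is symplectic Euler for a separable Lagrangian, and the GP is fit on placeholder residual data rather than real measurements):

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def step_nominal(q, p, h, dVdq, m=1.0):
        # variational (symplectic Euler) step for L(q, qdot) = m*qdot**2/2 - V(q)
        p_new = p - h * dVdq(q)
        q_new = q + h * p_new / m
        return q_new, p_new

    gp = GaussianProcessRegressor()
    X, y = np.random.rand(8, 2), np.random.rand(8)  # placeholder residual data
    gp.fit(X, y)                                     # state (q, p) -> unmodeled force

    def step_corrected(q, p, h, dVdq):
        q_new, p_new = step_nominal(q, p, h, dVdq)
        r = gp.predict(np.array([[q, p]]))[0]        # learned residual force
        return q_new, p_new + h * r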
arXiv Detail & Related papers (2021-12-10T11:09:29Z)
- Structure-preserving Sparse Identification of Nonlinear Dynamics for Data-driven Modeling [0.0]
We present a unification of the Sparse Identification of Nonlinear Dynamics (SINDy) formalism with neural ordinary differential equations.
The resulting framework allows learning of both "black-box" dynamics and structure-preserving bracket formalisms for both reversible and irreversible dynamics.
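For context, the core SINDy regression (sequentially thresholded least squares) can be sketched as below; the paper's contribution of combining this with neural ODEs and bracket structure is not reproduced here.

    import numpy as np

    def stlsq(Theta, dX, lam=0.1, iters=10):
        # solve dX ~= Theta @ Xi with hard-thresholded least squares for sparsity
        Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
        for _ in range(iters):
            small = np.abs(Xi) < lam
            Xi[small] = 0.0
            for k in range(dX.shape[1]):
                big = ~small[:, k]
                if big.any():
                    Xi[big, k] = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)[0]
        return Xi  # sparse coefficients over the candidate-function library Theta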
arXiv Detail & Related papers (2021-09-11T20:32:10Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
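For reference, plain exact DMD fits a linear operator to snapshot pairs; a NumPy sketch follows (the paper's method additionally models stochastic forcing and ensembles, which this omits):

    import numpy as np

    def dmd(X, Xp, r):
        # exact DMD: Xp ~= A @ X, with A approximated at rank r via the SVD of X
        U, s, Vh = np.linalg.svd(X, full_matrices=False)
        U, s, Vh = U[:, :r], s[:r], Vh[:r]
        Atilde = U.conj().T @ Xp @ Vh.conj().T / s  # r x r reduced operator
        eigvals, W = np.linalg.eig(Atilde)          # discrete-time eigenvalues
        modes = Xp @ Vh.conj().T / s @ W            # DMD modes in state space
        return eigvals, modes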
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Bayesian differential programming for robust systems identification under uncertainty [14.169588600819546]
This paper presents a machine learning framework for Bayesian systems identification from noisy, sparse and irregular observations of nonlinear dynamical systems.
The proposed method takes advantage of recent developments in differentiable programming to propagate gradient information through ordinary differential equation solvers.
The use of sparsity-promoting priors enables the discovery of interpretable and parsimonious representations for the underlying latent dynamics.
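A minimal sketch of the differentiable-ODE ingredient (assuming the torchdiffeq package; a Laplace sparsity prior appears as an L1 penalty, standing in for the paper's full Bayesian inference):

    import torch
    from torchdiffeq import odeint  # assumed dependency

    theta = torch.nn.Parameter(torch.zeros(3))

    def f(t, x):
        # candidate dynamics, linear in the unknown coefficients theta (illustrative)
        return theta[0] * x + theta[1] * x * x + theta[2]

    ts = torch.linspace(0.0, 1.0, 20)
    x0 = torch.tensor([1.0])
    x_obs = torch.exp(-ts).unsqueeze(-1)  # placeholder "observations"
    pred = odeint(f, x0, ts)
    loss = ((pred - x_obs) ** 2).mean() + 1e-2 * theta.abs().sum()
    loss.backward()                       # gradients flow through the ODE solver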
arXiv Detail & Related papers (2020-04-15T00:51:14Z)
- On dissipative symplectic integration with applications to gradient-based optimization [77.34726150561087]
We propose a geometric framework in which discretizations can be realized systematically.
We show that a generalization of symplectic integrators to nonconservative and, in particular, dissipative Hamiltonian systems is able to preserve rates of convergence up to a controlled error.
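A minimal instance: for a damped Hamiltonian system q' = p, p' = -grad V(q) - gamma*p, a conformal symplectic scheme composes the exact damping flow with a symplectic Euler step. The sketch below is a generic such scheme, not necessarily the paper's discretization; in the optimization reading, q plays the role of parameters and the scheme resembles momentum descent.

    import numpy as np

    def conformal_symplectic_step(q, p, h, dVdq, gamma):
        p = np.exp(-gamma * h) * p - h * dVdq(q)  # exact damping, then force kick
        q = q + h * p                              # drift with updated momentum
        return q, p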
arXiv Detail & Related papers (2020-04-15T00:36:49Z)
- Structure-preserving neural networks [0.08209843760716957]
We develop a method to learn physical systems from data using feedforward neural networks.
The method requires only a minimal amount of data by enforcing the metriplectic structure of dissipative Hamiltonian systems.
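This is the same metriplectic structure sketched after the abstract above; fitting it to data reduces to regressing a short rollout against snapshots. A minimal training-loop sketch, reusing the hypothetical metriplectic_rhs, E_net, S_net, K, and A from that earlier block:

    import torch

    snapshot_pairs = [(torch.randn(4), torch.randn(4)) for _ in range(16)]  # placeholder data
    params = list(E_net.parameters()) + list(S_net.parameters()) + [K, A]
    opt = torch.optim.Adam(params, lr=1e-3)
    dt = 0.01
    for x_t, x_next in snapshot_pairs:           # pairs of consecutive states
        pred = x_t + dt * metriplectic_rhs(x_t)  # one forward-Euler step
        loss = ((pred - x_next) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()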
arXiv Detail & Related papers (2020-04-09T16:41:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.