Incorporating physical constraints in a deep probabilistic machine
learning framework for coarse-graining dynamical systems
- URL: http://arxiv.org/abs/1912.12976v4
- Date: Wed, 13 May 2020 13:17:29 GMT
- Title: Incorporating physical constraints in a deep probabilistic machine
learning framework for coarse-graining dynamical systems
- Authors: Sebastian Kaltenbach, Phaedon-Stelios Koutsourelakis
- Abstract summary: This paper offers a data-based, probabilistic perspective that enables the quantification of predictive uncertainties.
We formulate the coarse-graining process by employing a probabilistic state-space model.
It is capable of reconstructing the evolution of the full, fine-scale system.
- Score: 7.6146285961466
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Data-based discovery of effective, coarse-grained (CG) models of
high-dimensional dynamical systems presents a unique challenge in computational
physics and particularly in the context of multiscale problems. The present
paper offers a data-based, probabilistic perspective that enables the
quantification of predictive uncertainties. One of the outstanding problems has
been the introduction of physical constraints in the probabilistic machine
learning objectives. The primary utility of such constraints stems from the
undisputed physical laws, such as conservation of mass and energy, that
they represent. Furthermore, apart from leading to physically realistic
predictions, they can significantly reduce the requisite amount of training
data, which for high-dimensional, multiscale systems is expensive to obtain
(Small Data regime). We formulate the coarse-graining process by employing a
probabilistic state-space model and account for the aforementioned equality
constraints as virtual observables in the associated densities. We demonstrate
how probabilistic inference tools can be employed to identify the
coarse-grained variables in combination with deep neural nets and their
evolution model without ever needing to define a fine-to-coarse (restriction)
projection and without needing time-derivatives of state variables.
Furthermore, the framework is capable of reconstructing the evolution of the
full, fine-scale system, and therefore the observables of interest need not be
selected a priori. We demonstrate the efficacy of the proposed framework by
applying it to systems of interacting particles and an image-series of a
nonlinear pendulum.
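The virtual-observables construction described in the abstract can be illustrated with a minimal sketch. Everything below (the linear toy dynamics, the specific constraint, and all variable names) is an assumption for illustration, not the paper's actual model: an equality constraint c(z) = 0, such as conservation of total mass, is treated as if it were observed with a tiny noise variance, which adds a sharply peaked likelihood factor to the state-space model's log-density.

```python
import numpy as np

def log_gaussian(x, mean, var):
    # log-density of independent Gaussians, summed over components
    return -0.5 * np.sum((x - mean) ** 2 / var + np.log(2 * np.pi * var))

def constraint(z):
    # c(z) = 0 encodes, e.g., conservation of total "mass" in the CG state
    return np.sum(z) - 1.0

def log_joint(z_seq, x_seq, A, var_trans=0.05**2, var_obs=0.1**2, var_virt=1e-6):
    """Log-density of a toy linear-Gaussian state-space model with the
    equality constraint c(z_t) = 0 added as a virtual observable: we
    pretend to observe 0 = c(z_t) + noise with tiny variance var_virt."""
    lp = 0.0
    for t in range(1, len(z_seq)):
        lp += log_gaussian(z_seq[t], A @ z_seq[t - 1], var_trans)   # transition
    for t in range(len(z_seq)):
        lp += log_gaussian(x_seq[t], z_seq[t], var_obs)             # real data
        lp += log_gaussian(0.0, constraint(z_seq[t]), var_virt)     # virtual obs
    return lp
```

State trajectories that satisfy the constraint receive a much higher joint density than those that violate it, so any inference scheme run on this density is softly pushed onto the constraint manifold without a hard projection step.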
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for the L^2, L^∞, and risk metrics, with learning rates adaptive to the coefficients' regularity.
Our method is available as an open-source Python library.
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - Learning Physics From Video: Unsupervised Physical Parameter Estimation for Continuous Dynamical Systems [49.11170948406405]
The state of the art in automatic parameter estimation from video relies on training supervised deep networks on large datasets.
We propose a method to estimate the physical parameters of any known, continuous governing equation from single videos.
arXiv Detail & Related papers (2024-10-02T09:44:54Z) - Neural Incremental Data Assimilation [8.817223931520381]
We introduce a deep learning approach where the physical system is modeled as a sequence of coarse-to-fine Gaussian prior distributions parametrized by a neural network.
This allows us to define an assimilation operator, which is trained in an end-to-end fashion to minimize the reconstruction error.
We illustrate our approach on chaotic dynamical physical systems with sparse observations, and compare it to traditional variational data assimilation methods.
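The coarse-to-fine prior idea in the summary above can be sketched in a few lines. This is a hand-built analogue, not the paper's method: where the paper parametrizes the sequence of Gaussian priors with a neural network trained end-to-end, the toy below uses fixed moving-average smoothing for the prior mean and a shrinking prior variance per stage, and blends prior and sparse observations by precision (all names and numbers are assumptions).

```python
import numpy as np

def smooth(x, w):
    # moving-average smoothing; stands in for the coarse prior's correlations
    return np.convolve(x, np.ones(w) / w, mode="same")

def assimilate(y, idx, n, stages=((1.0, 9), (0.3, 5), (0.1, 3)), obs_var=1e-2):
    """Coarse-to-fine Gaussian updates: each stage uses a smoothed version of
    the previous estimate as the prior mean (with prior variance pv shrinking
    stage by stage) and fuses it with observations y at indices idx via the
    standard precision-weighted conjugate-Gaussian update."""
    x = np.zeros(n)
    for pv, w in stages:
        prior_mean = smooth(x, w)
        x = prior_mean.copy()
        x[idx] = (prior_mean[idx] / pv + y / obs_var) / (1.0 / pv + 1.0 / obs_var)
    return x
```

Each pass refines the reconstruction: the smoothing spreads information away from the sparse observation sites, while the precision weighting keeps the estimate anchored to the data where it exists.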
arXiv Detail & Related papers (2024-06-21T11:42:55Z) - Interpretable Self-Aware Neural Networks for Robust Trajectory
Prediction [50.79827516897913]
We introduce an interpretable paradigm for trajectory prediction that distributes the uncertainty among semantic concepts.
We validate our approach on real-world autonomous driving data, demonstrating superior performance over state-of-the-art baselines.
arXiv Detail & Related papers (2022-11-16T06:28:20Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Physics-aware, deep probabilistic modeling of multiscale dynamics in the
Small Data regime [0.0]
The present paper offers a probabilistic perspective that simultaneously identifies predictive, lower-dimensional coarse-grained (CG) variables as well as their dynamics.
We make use of the expressive ability of deep neural networks in order to represent the right-hand side of the CG evolution law.
We demonstrate the efficacy of the proposed framework in a high-dimensional system of moving particles.
arXiv Detail & Related papers (2021-02-08T15:04:05Z) - Physics-aware, probabilistic model order reduction with guaranteed
stability [0.0]
We propose a generative framework for learning an effective, lower-dimensional, coarse-grained dynamical model.
We demonstrate its efficacy and accuracy in multiscale physical systems of particle dynamics.
arXiv Detail & Related papers (2021-01-14T19:16:51Z) - General stochastic separation theorems with optimal bounds [68.8204255655161]
The phenomenon of stochastic separability was revealed and used in machine learning to correct errors of Artificial Intelligence (AI) systems and to analyze AI instabilities.
Errors or clusters of errors can be separated from the rest of the data.
The ability to correct an AI system also opens up the possibility of an attack on it, and the high dimensionality induces vulnerabilities caused by the same separability.
arXiv Detail & Related papers (2020-10-11T13:12:41Z) - A probabilistic generative model for semi-supervised training of
coarse-grained surrogates and enforcing physical constraints through virtual
observables [3.8073142980733]
This paper provides a flexible, probabilistic framework that accounts for physical structure and information both in the training objectives and in the surrogate model itself.
We advocate a probabilistic model in which equalities that are available from the physics can be introduced as virtual observables and can provide additional information through the likelihood.
arXiv Detail & Related papers (2020-06-02T17:14:36Z) - Embedded-physics machine learning for coarse-graining and collective
variable discovery without data [3.222802562733787]
We present a novel learning framework that consistently embeds underlying physics.
We propose a novel objective based on reverse Kullback-Leibler divergence that fully incorporates the available physics in the form of the atomistic force field.
We demonstrate the algorithmic advances in terms of predictive ability and the physical meaning of the revealed CVs for a bimodal potential energy function and the alanine dipeptide.
arXiv Detail & Related papers (2020-02-24T10:28:41Z)
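The reverse Kullback-Leibler objective mentioned in the last summary can be illustrated with a dependency-free toy. This is a simplified analogue under stated assumptions, not the paper's algorithm: a Gaussian coarse-grained model q is fitted to a Boltzmann target p(x) ∝ exp(-U(x)) by minimizing KL(q || p), which needs only the potential U (the "force field"), never samples from p. The harmonic U and the grid search stand in for the paper's richer models and gradient-based training.

```python
import numpy as np

rng = np.random.default_rng(1)

def U(x):
    # toy "atomistic" potential: a single harmonic well centered at x = 2
    return 0.5 * (x - 2.0) ** 2

def reverse_kl(mu, log_sigma, n=4000):
    # Monte Carlo estimate of KL(q || p) up to the constant log Z, using
    # reparametrized samples x = mu + sigma * eps from q = N(mu, sigma^2)
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n)
    x = mu + sigma * eps
    log_q = -0.5 * eps**2 - log_sigma - 0.5 * np.log(2 * np.pi)
    log_p = -U(x)                     # unnormalized Boltzmann target
    return np.mean(log_q - log_p)

# crude grid search instead of stochastic gradients, to keep the sketch minimal
best = min(((m, s) for m in np.linspace(0, 4, 41) for s in np.linspace(-2, 1, 31)),
           key=lambda p: reverse_kl(*p))
```

For this U the exact target is N(2, 1), and the minimizer of the estimated reverse KL lands on mu ≈ 2 and sigma ≈ 1, showing that the CG density can be learned from the potential alone, i.e., without data.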
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.