Physics-aware, probabilistic model order reduction with guaranteed
stability
- URL: http://arxiv.org/abs/2101.05834v1
- Date: Thu, 14 Jan 2021 19:16:51 GMT
- Title: Physics-aware, probabilistic model order reduction with guaranteed
stability
- Authors: Sebastian Kaltenbach, Phaedon-Stelios Koutsourelakis
- Abstract summary: We propose a generative framework for learning an effective, lower-dimensional, coarse-grained dynamical model.
We demonstrate its efficacy and accuracy in multiscale physical systems of particle dynamics.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Given (small amounts of) time-series data from a high-dimensional,
fine-grained, multiscale dynamical system, we propose a generative framework
for learning an effective, lower-dimensional, coarse-grained dynamical model
that is predictive of the fine-grained system's long-term evolution but also of
its behavior under different initial conditions. We target fine-grained models
as they arise in physical applications (e.g. molecular dynamics, agent-based
models) whose dynamics are strongly non-stationary and whose transition to
equilibrium is governed by unknown slow processes that are largely inaccessible
to brute-force simulation. Approaches based on domain knowledge rely heavily on
physical insight to identify temporally slow features and fail to enforce the
long-term stability of the learned dynamics. Purely statistical frameworks, on
the other hand, lack interpretability and require large amounts of expensive
simulation data (multiple, long trajectories) because they cannot incorporate
domain knowledge. The proposed generative framework achieves these desiderata
by employing a flexible prior on the complex plane for the latent, slow
processes, together with an intermediate layer of physics-motivated latent
variables that reduces the reliance on data and injects inductive bias. In
contrast to existing schemes, it does not require the a priori definition of
projection operators from the fine-grained description and addresses
simultaneously the tasks of dimensionality reduction and model estimation. We
demonstrate its efficacy and accuracy in multiscale physical systems of
particle dynamics, producing probabilistic, long-term predictions of phenomena
that are not contained in the training data.
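As a concrete illustration of the stability claim, here is a minimal sketch (not the authors' implementation; variable names and parameter values are illustrative assumptions) of latent slow modes whose continuous-time eigenvalues are restricted to the left half of the complex plane. Under that restriction every discrete-time update is non-expanding, so arbitrarily long rollouts of the coarse-grained latent dynamics remain bounded.

```python
import numpy as np

# Sketch only: latent slow modes z evolve as z_{t+1} = exp(lambda * dt) * z_t,
# with lambda in the complex plane. Constraining Re(lambda) <= 0 gives
# |exp(lambda * dt)| <= 1 for every mode, so no trajectory can blow up.

rng = np.random.default_rng(0)
dt = 0.1          # assumed coarse-grained time step
n_modes = 4       # assumed number of latent slow processes

# Hypothetical "prior on the complex plane": non-positive real parts
# (decay rates) and unconstrained imaginary parts (oscillation frequencies).
decay = -np.abs(rng.normal(0.5, 0.2, size=n_modes))  # Re(lambda) <= 0 by construction
freq = rng.normal(0.0, 2.0, size=n_modes)            # Im(lambda) free
lam = decay + 1j * freq

def rollout(z0, steps):
    """Propagate each latent mode by repeated multiplication with exp(lambda*dt)."""
    step = np.exp(lam * dt)                           # |step| <= 1 component-wise
    z = np.empty((steps + 1, n_modes), dtype=complex)
    z[0] = z0
    for t in range(steps):
        z[t + 1] = step * z[t]
    return z

z = rollout(rng.standard_normal(n_modes) + 0j, steps=5000)
print("max |z| over the rollout:", np.abs(z).max())   # never exceeds max |z0|
```

In the paper these eigenvalues (and the physics-motivated intermediate variables) are inferred from data rather than sampled; the sketch only shows why confining the spectrum to the left half-plane is sufficient for long-term stability.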
Related papers
- Learning Physics From Video: Unsupervised Physical Parameter Estimation for Continuous Dynamical Systems [49.11170948406405]
The state of the art in automatic parameter estimation from video relies on training supervised deep networks on large datasets.
We propose a method to estimate the physical parameters of any known, continuous governing equation from single videos.
arXiv Detail & Related papers (2024-10-02T09:44:54Z)
- eXponential FAmily Dynamical Systems (XFADS): Large-scale nonlinear Gaussian state-space modeling [9.52474299688276]
We introduce a low-rank structured variational autoencoder framework for nonlinear state-space graphical models.
We show that our approach consistently demonstrates the ability to learn a more predictive generative model.
arXiv Detail & Related papers (2024-03-03T02:19:49Z)
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Physics-guided Deep Markov Models for Learning Nonlinear Dynamical Systems with Uncertainty [6.151348127802708]
We propose a physics-guided framework termed the Physics-guided Deep Markov Model (PgDMM).
The proposed framework takes advantage of the expressive power of deep learning, while retaining the driving physics of the dynamical system.
arXiv Detail & Related papers (2021-10-16T16:35:12Z)
- Hard Encoding of Physics for Learning Spatiotemporal Dynamics [8.546520029145853]
We propose a deep learning architecture that forcibly encodes known physics to facilitate learning in a data-driven manner.
This coercive encoding of physics, which is fundamentally different from penalty-based physics-informed learning, ensures that the network rigorously obeys the given physics.
arXiv Detail & Related papers (2021-05-02T21:40:39Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Physics-aware, deep probabilistic modeling of multiscale dynamics in the Small Data regime [0.0]
The present paper offers a probabilistic perspective that simultaneously identifies predictive, lower-dimensional coarse-grained (CG) variables as well as their dynamics.
We make use of the expressive ability of deep neural networks in order to represent the right-hand side of the CG evolution law.
We demonstrate the efficacy of the proposed framework in a high-dimensional system of moving particles.
arXiv Detail & Related papers (2021-02-08T15:04:05Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Incorporating physical constraints in a deep probabilistic machine learning framework for coarse-graining dynamical systems [7.6146285961466]
This paper offers a data-based, probabilistic perspective that enables the quantification of predictive uncertainties.
We formulate the coarse-graining process by employing a probabilistic state-space model.
It is capable of reconstructing the evolution of the full, fine-scale system.
arXiv Detail & Related papers (2019-12-30T16:07:46Z)