Bounded nonlinear forecasts of partially observed geophysical systems
with physics-constrained deep learning
- URL: http://arxiv.org/abs/2202.05750v1
- Date: Fri, 11 Feb 2022 16:40:46 GMT
- Authors: Said Ouala, Steven L. Brunton, Ananda Pascual, Bertrand Chapron,
Fabrice Collard, Lucile Gaultier, Ronan Fablet
- Abstract summary: We investigate the physics-constrained learning of implicit dynamical embeddings, leveraging neural ordinary differential equation (NODE) representations.
A key objective is to constrain their boundedness, which promotes the generalization of the learned dynamics to arbitrary initial conditions.
- Score: 30.238425143378414
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The complexity of real-world geophysical systems is often compounded by the
fact that the observed measurements depend on hidden variables. These latent
variables include unresolved small scales and/or rapidly evolving processes,
partially observed couplings, or forcings in coupled systems. This is the case
in ocean-atmosphere dynamics, for which unknown interior dynamics can affect
surface observations. The identification of computationally-relevant
representations of such partially-observed and highly nonlinear systems is thus
challenging and often limited to short-term forecast applications. Here, we
investigate the physics-constrained learning of implicit dynamical embeddings,
leveraging neural ordinary differential equation (NODE) representations. A key
objective is to constrain their boundedness, which promotes the generalization
of the learned dynamics to arbitrary initial conditions. The proposed
architecture is implemented within a deep learning framework, and its relevance
is demonstrated against state-of-the-art schemes on case studies
representative of geophysical dynamics.
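One classical way to guarantee bounded trajectories for a quadratic system, which a boundedness-constrained NODE parameterization can build on, is to combine an energy-preserving quadratic term (x · Q(x, x) = 0 for all x) with a linear term whose symmetric part is negative definite. The following minimal NumPy sketch illustrates that construction; the dimension, coefficients, and the fixed-step RK4 integrator are illustrative choices and not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3  # toy state dimension

# Random quadratic tensor; removing its fully symmetric part makes the
# cubic form x . Q(x, x) vanish identically (energy preservation).
Q = rng.standard_normal((d, d, d))
perms = [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]
Q -= sum(np.transpose(Q, p) for p in perms) / 6.0

# Linear part: skew-symmetric (rotational) plus a strictly dissipative shift,
# so that d/dt ||x||^2 = 2 x . L x = -2 ||x||^2 <= 0 along trajectories.
A = rng.standard_normal((d, d))
L = 0.5 * (A - A.T) - np.eye(d)

def f(x):
    """Vector field dx/dt = L x + Q(x, x)."""
    return L @ x + np.einsum('ijk,j,k->i', Q, x, x)

def rk4_step(x, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate from an arbitrary initial condition; the norm cannot blow up.
x = np.array([2.0, -1.5, 1.0])
norms = [np.linalg.norm(x)]
for _ in range(4000):
    x = rk4_step(x, 0.005)
    norms.append(np.linalg.norm(x))
```

In a learning setting, L and Q would be trainable parameters with the same constraints enforced by construction (projection and antisymmetrization), so boundedness holds for any initial condition rather than only near the training data.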
Related papers
- Response Estimation and System Identification of Dynamical Systems via Physics-Informed Neural Networks [0.0]
This paper explores the use of Physics-Informed Neural Networks (PINNs) for the identification and estimation of dynamical systems.
PINNs offer a unique advantage by embedding known physical laws directly into the neural network's loss function, allowing complex physical phenomena to be incorporated in a straightforward way.
The results demonstrate that PINNs deliver an efficient tool for all of the aforementioned tasks, even in the presence of modelling errors.
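The physics-in-the-loss idea can be sketched with a toy composite loss: a data-fit term plus the mean squared residual of a governing equation. Here the "physics" is a damped harmonic oscillator and derivatives are taken by finite differences rather than automatic differentiation; all names, coefficients, and the weighting are illustrative, not taken from the cited paper.

```python
import numpy as np

# Damped oscillator: u'' + 2*zeta*omega*u' + omega**2 * u = 0
t = np.linspace(0.0, 5.0, 501)
dt = t[1] - t[0]
omega_true, zeta_true = 2.0, 0.1

# Closed-form solution used as synthetic, noise-free observations.
wd = omega_true * np.sqrt(1.0 - zeta_true**2)
u_obs = np.exp(-zeta_true * omega_true * t) * np.cos(wd * t)

def physics_residual(u, omega, zeta):
    """ODE residual of a candidate solution, via finite differences."""
    du = np.gradient(u, dt)
    d2u = np.gradient(du, dt)
    return d2u + 2.0 * zeta * omega * du + omega**2 * u

def pinn_style_loss(u_pred, u_obs, omega, zeta, lam=1.0):
    """Data misfit plus weighted physics residual, PINN-style."""
    data = np.mean((u_pred - u_obs) ** 2)
    phys = np.mean(physics_residual(u_pred, omega, zeta) ** 2)
    return data + lam * phys
```

With the data term held fixed, the physics term penalizes parameters inconsistent with the observed trajectory: evaluating the loss at the true (omega, zeta) gives a near-zero residual (up to finite-difference error), while a wrong omega inflates it, which is what lets the composite loss drive both solution and parameter identification.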
arXiv Detail & Related papers (2024-10-02T08:58:30Z)
- Neural Persistence Dynamics [8.197801260302642]
We consider the problem of learning the dynamics in the topology of time-evolving point clouds.
Our proposed model -- Neural Persistence Dynamics -- substantially outperforms the state-of-the-art across a diverse set of parameter regression tasks.
arXiv Detail & Related papers (2024-05-24T17:20:18Z)
- On the Dynamics Under the Unhinged Loss and Beyond [104.49565602940699]
We introduce the unhinged loss, a concise loss function that offers more mathematical opportunities to analyze closed-form dynamics.
The unhinged loss also makes it possible to consider more practical techniques, such as time-varying learning rates and feature normalization.
arXiv Detail & Related papers (2023-12-13T02:11:07Z)
- Learning Fine Scale Dynamics from Coarse Observations via Inner Recurrence [0.0]
Recent work has focused on data-driven learning of the evolution of unknown systems via deep neural networks (DNNs).
This paper presents a computational technique to learn the fine-scale dynamics from such coarsely observed data.
arXiv Detail & Related papers (2022-06-03T20:28:52Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of (dissipative) lattice Green's functions.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Structure-Preserving Learning Using Gaussian Processes and Variational Integrators [62.31425348954686]
We propose the combination of a variational integrator for the nominal dynamics of a mechanical system and learning residual dynamics with Gaussian process regression.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
arXiv Detail & Related papers (2021-12-10T11:09:29Z)
- Physics-guided Deep Markov Models for Learning Nonlinear Dynamical Systems with Uncertainty [6.151348127802708]
We propose a physics-guided framework, termed Physics-guided Deep Markov Model (PgDMM).
The proposed framework takes advantage of the expressive power of deep learning, while retaining the driving physics of the dynamical system.
arXiv Detail & Related papers (2021-10-16T16:35:12Z)
- Discovering Latent Causal Variables via Mechanism Sparsity: A New Principle for Nonlinear ICA [81.4991350761909]
Independent component analysis (ICA) refers to an ensemble of methods which formalize this goal and provide estimation procedures for practical applications.
We show that the latent variables can be recovered up to a permutation if one regularizes the latent mechanisms to be sparse.
arXiv Detail & Related papers (2021-07-21T14:22:14Z)
- Learning Continuous System Dynamics from Irregularly-Sampled Partial Observations [33.63818978256567]
We present LG-ODE, a latent ordinary differential equation generative model for modeling multi-agent dynamic systems with known graph structure.
It can simultaneously learn the embedding of high dimensional trajectories and infer continuous latent system dynamics.
Our model employs a novel encoder parameterized by a graph neural network that can infer initial states in an unsupervised way.
arXiv Detail & Related papers (2020-11-08T01:02:22Z)
- Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable Dynamical Systems [74.80320120264459]
We present an approach to learn complex motions from a limited number of human demonstrations.
The motions are encoded as rollouts of a stable dynamical system.
The efficacy of this approach is demonstrated through validation on an established benchmark as well as on demonstrations collected on a real-world robotic system.
arXiv Detail & Related papers (2020-05-27T03:51:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.