Smooth and Sparse Latent Dynamics in Operator Learning with Jerk
Regularization
- URL: http://arxiv.org/abs/2402.15636v1
- Date: Fri, 23 Feb 2024 22:38:45 GMT
- Title: Smooth and Sparse Latent Dynamics in Operator Learning with Jerk
Regularization
- Authors: Xiaoyu Xie, Saviz Mowlavi, Mouhacine Benosman
- Abstract summary: This paper introduces a continuous operator learning framework that incorporates jerk regularization into the learning of the compressed latent space.
The framework allows for inference at any desired spatial or temporal resolution.
The effectiveness of this framework is demonstrated through a two-dimensional unsteady flow problem governed by the Navier-Stokes equations.
- Score: 1.621267003497711
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spatiotemporal modeling is critical for understanding complex systems across
various scientific and engineering disciplines, but governing equations are
often not fully known or computationally intractable due to inherent system
complexity. Data-driven reduced-order models (ROMs) offer a promising approach
for fast and accurate spatiotemporal forecasting by computing solutions in a
compressed latent space. However, these models often neglect temporal
correlations between consecutive snapshots when constructing the latent space,
leading to suboptimal compression, jagged latent trajectories, and limited
extrapolation ability over time. To address these issues, this paper introduces
a continuous operator learning framework that incorporates jerk regularization
into the learning of the compressed latent space. This jerk regularization
promotes smoothness and sparsity of latent space dynamics, which not only
yields enhanced accuracy and convergence speed but also helps identify
intrinsic latent space coordinates. Consisting of an implicit neural
representation (INR)-based autoencoder and a neural ODE latent dynamics model,
the framework allows for inference at any desired spatial or temporal
resolution. The effectiveness of this framework is demonstrated through a
two-dimensional unsteady flow problem governed by the Navier-Stokes equations,
highlighting its potential to expedite high-fidelity simulations in various
scientific and engineering applications.
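The jerk regularization at the heart of the framework penalizes the third time derivative of the latent trajectory. On uniformly sampled latent snapshots this can be approximated with a third-order finite difference; an L1 norm then encourages both smooth and sparse latent dynamics. The following is a minimal sketch of such a penalty (the function name and the specific discretization are illustrative, not the paper's implementation):

```python
import numpy as np

def jerk_penalty(z, dt):
    """L1 penalty on the third finite difference (jerk) of a latent trajectory.

    z  : array of shape (T, d), latent states at uniformly spaced times
    dt : time step between consecutive snapshots
    """
    # Third-order forward difference: z[i+3] - 3 z[i+2] + 3 z[i+1] - z[i]
    jerk = (z[3:] - 3.0 * z[2:-1] + 3.0 * z[1:-2] - z[:-3]) / dt**3
    # L1 norm promotes smoothness (small jerk) and sparsity of latent dynamics
    return np.abs(jerk).mean()
```

As a sanity check, a latent coordinate that is quadratic in time has zero jerk, while a cubic coordinate has constant jerk; in training, this term would be added to the autoencoder reconstruction loss with a tunable weight.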
Related papers
- Space and Time Continuous Physics Simulation From Partial Observations [0.0]
Data-driven methods based on large-scale machine learning promise high adaptivity by integrating long-range dependencies more directly and efficiently.
We focus on fluid dynamics and address a shortcoming of much of the literature, which relies on a fixed support for computations and predictions in the form of regular or irregular grids.
We propose a novel setup to perform predictions in a continuous spatial and temporal domain while being trained on sparse observations.
arXiv Detail & Related papers (2024-01-17T13:24:04Z) - Spatio-Temporal Branching for Motion Prediction using Motion Increments [55.68088298632865]
Human motion prediction (HMP) has emerged as a popular research topic due to its diverse applications.
Traditional methods rely on hand-crafted features and machine learning techniques.
We propose a novel spatio-temporal branching network using incremental information for HMP.
arXiv Detail & Related papers (2023-08-02T12:04:28Z) - Evolve Smoothly, Fit Consistently: Learning Smooth Latent Dynamics For
Advection-Dominated Systems [14.553972457854517]
We present a data-driven, space-time continuous framework to learn surrogate models for complex physical systems.
We leverage the expressive power of the network and a specially designed consistency-inducing regularization to obtain latent trajectories that are both low-dimensional and smooth.
arXiv Detail & Related papers (2023-01-25T03:06:03Z) - Implicit Neural Spatial Representations for Time-dependent PDEs [29.404161110513616]
Implicit Neural Spatial Representation (INSR) has emerged as an effective representation of spatially-dependent vector fields.
This work explores solving time-dependent PDEs with INSR.
arXiv Detail & Related papers (2022-09-30T22:46:40Z) - On Fast Simulation of Dynamical System with Neural Vector Enhanced
Numerical Solver [59.13397937903832]
We introduce a deep learning-based corrector called Neural Vector (NeurVec).
NeurVec can compensate for integration errors and enable larger time step sizes in simulations.
Our experiments on a variety of complex dynamical system benchmarks demonstrate that NeurVec exhibits remarkable generalization capability.
arXiv Detail & Related papers (2022-08-07T09:02:18Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Consistency of mechanistic causal discovery in continuous-time using
Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - Neural Ordinary Differential Equations for Data-Driven Reduced Order
Modeling of Environmental Hydrodynamics [4.547988283172179]
We explore the use of Neural Ordinary Differential Equations for fluid flow simulation.
Test problems we consider include incompressible flow around a cylinder and real-world applications of shallow water hydrodynamics in riverine and estuarine systems.
Our findings indicate that Neural ODEs provide an elegant framework for stable and accurate evolution of latent-space dynamics with a promising potential of extrapolatory predictions.
arXiv Detail & Related papers (2021-04-22T19:20:47Z) - Latent-space time evolution of non-intrusive reduced-order models using
Gaussian process emulation [0.6850683267295248]
Non-intrusive reduced-order models (ROMs) provide a low-dimensional framework for systems that may be intrinsically high-dimensional.
By bypassing equation-based evolution, however, the interpretability of the ROM framework often suffers.
In this article, we propose the use of a novel latent-space algorithm based on Gaussian process regression.
arXiv Detail & Related papers (2020-07-23T17:56:47Z) - First Steps: Latent-Space Control with Semantic Constraints for
Quadruped Locomotion [73.37945453998134]
Traditional approaches to quadruped control employ simplified, hand-derived models.
This significantly reduces the capability of the robot since its effective kinematic range is curtailed.
In this work, these challenges are addressed by framing quadruped control as optimisation in a structured latent space.
A deep generative model captures a statistical representation of feasible joint configurations, whilst complex dynamic and terminal constraints are expressed via high-level, semantic indicators.
We validate the feasibility of locomotion trajectories optimised using our approach both in simulation and on a real-world quadruped.
arXiv Detail & Related papers (2020-07-03T07:04:18Z) - Supporting Optimal Phase Space Reconstructions Using Neural Network
Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the phase space's properties.
Our approach is either as competitive as or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
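Several of the entries above (MTGODE, the causal-discovery work, the hydrodynamics ROMs, and the main paper itself) share the same core machinery: a latent state evolved forward in time by a neural ODE. A minimal sketch of that rollout loop is shown below, with the learned vector field replaced by a hand-written placeholder; all names here are illustrative, and a real neural ODE would supply `f` as a trained network and typically use an adaptive solver:

```python
import numpy as np

def rk4_step(f, z, t, dt):
    """One classical fourth-order Runge-Kutta step for dz/dt = f(t, z)."""
    k1 = f(t, z)
    k2 = f(t + dt / 2, z + dt / 2 * k1)
    k3 = f(t + dt / 2, z + dt / 2 * k2)
    k4 = f(t + dt, z + dt * k3)
    return z + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def rollout(f, z0, t0, dt, steps):
    """Integrate a latent state forward in time, returning the full trajectory.

    In a neural ODE, f would be the trained latent dynamics network; the
    decoder would then map each latent state back to the full field.
    """
    traj = [z0]
    z, t = z0, t0
    for _ in range(steps):
        z = rk4_step(f, z, t, dt)
        t += dt
        traj.append(z)
    return np.stack(traj)
```

Because the dynamics live in a low-dimensional latent space, each step is cheap, which is what enables the fast forecasting and temporal extrapolation these papers target.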
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.