Modeling Spatio-temporal Dynamical Systems with Neural Discrete Learning
and Levels-of-Experts
- URL: http://arxiv.org/abs/2402.05970v1
- Date: Tue, 6 Feb 2024 06:27:07 GMT
- Title: Modeling Spatio-temporal Dynamical Systems with Neural Discrete Learning
and Levels-of-Experts
- Authors: Kun Wang, Hao Wu, Guibin Zhang, Junfeng Fang, Yuxuan Liang, Yuankai
Wu, Roger Zimmermann, Yang Wang
- Abstract summary: We address the issue of modeling and estimating changes in the state of spatio-temporal dynamical systems based on a sequence of observations like video frames.
This paper proposes a universal expert module, namely an optical flow estimation component, to capture the evolution laws of general physical processes in a data-driven fashion.
We conduct extensive experiments and ablations to demonstrate that the proposed framework outperforms the existing SOTA baselines by large margins.
- Score: 33.335735613579914
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we address the issue of modeling and estimating
changes in the state of spatio-temporal dynamical systems based on a sequence
of observations such as video frames. Traditional numerical simulation systems
depend largely on the initial settings and the correctness of the constructed
partial differential equations (PDEs). Despite recent efforts yielding
significant success in discovering data-driven PDEs with neural networks, the
limitations posed by singular scenarios and the absence of local insights
prevent them from performing effectively in a broader real-world context. To
this end, this paper proposes a universal expert module, namely an optical
flow estimation component, to capture the evolution laws of general physical
processes in a data-driven fashion. To enhance local insight, we painstakingly
design a finer-grained physical pipeline, since local characteristics may be
influenced by various internal contextual information and may contradict the
macroscopic properties of the whole system. Further, we harness the currently
popular neural discrete learning to unveil the important underlying features
in its latent space; this process injects interpretability and helps us obtain
a powerful prior over these discrete random variables. We conduct extensive
experiments and ablations to demonstrate that the proposed framework
outperforms the existing SOTA baselines by large margins.
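The "neural discrete learning" referenced in the abstract belongs to the family of vector-quantized latent models, where continuous latents are snapped to a learned codebook of discrete codes. As a rough, hypothetical sketch (the codebook size, latent dimension, and the `quantize` helper below are illustrative assumptions, not details from the paper), a nearest-neighbor codebook lookup works as follows:

```python
import numpy as np

rng = np.random.default_rng(0)
codebook = rng.normal(size=(64, 16))  # 64 discrete codes, 16 dims each (assumed sizes)

def quantize(z):
    """Snap each continuous latent vector in z (N, 16) to its nearest codebook entry."""
    # Pairwise squared distances between latents and codebook rows: (N, 64)
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d.argmin(axis=1)            # discrete index of the chosen code per latent
    return codebook[idx], idx

z = rng.normal(size=(8, 16))          # stand-in for encoder outputs of 8 patches
zq, idx = quantize(z)
print(zq.shape, idx.shape)            # (8, 16) (8,)
```

The discrete indices are what make a prior over the latent space tractable and more interpretable: instead of a density over continuous vectors, one can count or model categorical code usage.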
Related papers
- Neural Incremental Data Assimilation [8.817223931520381]
We introduce a deep learning approach where the physical system is modeled as a sequence of coarse-to-fine Gaussian prior distributions parametrized by a neural network.
This allows us to define an assimilation operator, which is trained in an end-to-end fashion to minimize the reconstruction error.
We illustrate our approach on chaotic dynamical physical systems with sparse observations, and compare it to traditional variational data assimilation methods.
arXiv Detail & Related papers (2024-06-21T11:42:55Z)
- eXponential FAmily Dynamical Systems (XFADS): Large-scale nonlinear Gaussian state-space modeling [9.52474299688276]
We introduce a low-rank structured variational autoencoder framework for nonlinear state-space graphical models.
We show that our approach consistently demonstrates the ability to learn a more predictive generative model.
arXiv Detail & Related papers (2024-03-03T02:19:49Z)
- Deep Learning-based Analysis of Basins of Attraction [49.812879456944984]
This research addresses the challenge of characterizing the complexity and unpredictability of basins within various dynamical systems.
The main focus is on demonstrating the efficiency of convolutional neural networks (CNNs) in this field.
arXiv Detail & Related papers (2023-09-27T15:41:12Z)
- Understanding Self-attention Mechanism via Dynamical System Perspective [58.024376086269015]
Self-attention mechanism (SAM) is widely used in various fields of artificial intelligence.
We show that the intrinsic stiffness phenomenon (SP) found in high-precision solutions of ordinary differential equations (ODEs) also widely exists in high-performance neural networks (NNs).
We show that the SAM is also a stiffness-aware step size adaptor that can enhance the model's representational ability to measure intrinsic SP.
arXiv Detail & Related papers (2023-08-19T08:17:41Z)
- Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z)
- Learning dynamics from partial observations with structured neural ODEs [5.757156314867639]
We propose a flexible framework to incorporate a broad spectrum of physical insight into neural ODE-based system identification.
We demonstrate the performance of the proposed approach on numerical simulations and on an experimental dataset from a robotic exoskeleton.
arXiv Detail & Related papers (2022-05-25T07:54:10Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Physics-guided Deep Markov Models for Learning Nonlinear Dynamical Systems with Uncertainty [6.151348127802708]
We propose a physics-guided framework, termed the Physics-guided Deep Markov Model (PgDMM).
The proposed framework takes advantage of the expressive power of deep learning, while retaining the driving physics of the dynamical system.
arXiv Detail & Related papers (2021-10-16T16:35:12Z)
- Disentangled Generative Models for Robust Prediction of System Dynamics [2.6424064030995957]
In this work, we treat the domain parameters of dynamical systems as factors of variation of the data generating process.
By leveraging ideas from supervised disentanglement and causal factorization, we aim to separate the domain parameters from the dynamics in the latent space of generative models.
Results indicate that disentangled VAEs adapt better to domain parameters spaces that were not present in the training data.
arXiv Detail & Related papers (2021-08-26T09:58:06Z)
- Meta-learning using privileged information for dynamics [66.32254395574994]
We extend the Neural ODE Process model to use additional information within the Learning Using Privileged Information setting.
We validate our extension with experiments showing improved accuracy and calibration on simulated dynamics tasks.
arXiv Detail & Related papers (2021-04-29T12:18:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.