Hierarchical Physics-Embedded Learning for Spatiotemporal Dynamical Systems
- URL: http://arxiv.org/abs/2510.25306v1
- Date: Wed, 29 Oct 2025 09:18:41 GMT
- Title: Hierarchical Physics-Embedded Learning for Spatiotemporal Dynamical Systems
- Authors: Xizhe Wang, Xiaobin Song, Qingshan Jia, Hongbo Zhao, Benben Jiang,
- Abstract summary: We propose a hierarchical physics-embedded learning framework that advances both forward spatiotemporal prediction and inverse discovery of physical laws. Known physical laws are directly embedded into the model's computational graph, guaranteeing physical consistency. By building the framework upon adaptive Fourier Neural Operators, we can effectively capture the non-local dependencies and high-order operators characteristic of dynamical systems.
- Score: 12.832325257647128
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Modeling complex spatiotemporal dynamics, particularly in far-from-equilibrium systems, remains a grand challenge in science. The governing partial differential equations (PDEs) for these systems are often intractable to derive from first principles, due to their inherent complexity, characterized by high-order derivatives and strong nonlinearities, coupled with incomplete physical knowledge. This has spurred the development of data-driven methods, yet these approaches face limitations: Purely data-driven models are often physically inconsistent and data-intensive, while existing physics-informed methods lack the structural capacity to represent complex operators or systematically integrate partial physical knowledge. Here, we propose a hierarchical physics-embedded learning framework that fundamentally advances both the forward spatiotemporal prediction and inverse discovery of physical laws from sparse and noisy data. The key innovation is a two-level architecture that mirrors the process of scientific discovery: the first level learns fundamental symbolic components of a PDE, while the second learns their governing combinations. This hierarchical decomposition not only reduces learning complexity but, more importantly, enables a structural integration of prior knowledge. Known physical laws are directly embedded into the model's computational graph, guaranteeing physical consistency and improving data efficiency. By building the framework upon adaptive Fourier Neural Operators, we can effectively capture the non-local dependencies and high-order operators characteristic of dynamical systems. Additionally, by structurally decoupling known and unknown terms, the framework further enables interpretable discovery of underlying governing equations through symbolic regression, without presupposing functional forms.
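The two-level idea in the abstract can be sketched in miniature. The sketch below replaces the paper's adaptive Fourier Neural Operators with a plain linear term library, and the Burgers-like equation, grid, and all function names are illustrative assumptions, not details from the paper: level one supplies symbolic PDE components (a known diffusion term plus a candidate advection term), and level two recovers the coefficients that combine them from observed time derivatives.

```python
import numpy as np

# Level 1: a library of symbolic PDE components on a periodic 1-D grid.
# diffusion_term stands in for "known physics"; advection_term is a candidate.
def diffusion_term(u, dx):
    # second spatial derivative via central differences (periodic boundary)
    return (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2

def advection_term(u, dx):
    # nonlinear advection u * u_x via central differences
    return u * (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)

N = 128
dx = 2.0 * np.pi / N
x = np.arange(N) * dx
u = np.sin(x) + 0.5 * np.cos(2.0 * x)       # one observed field snapshot

terms = [diffusion_term, advection_term]
true_coeffs = np.array([0.1, -1.0])         # u_t = 0.1 u_xx - u u_x (Burgers-like)

# Level-1 features for the snapshot, one column per symbolic component
Theta = np.stack([t(u, dx) for t in terms], axis=1)   # shape (N, 2)

# Synthetic "observed" time derivative generated by the true law
ut = Theta @ true_coeffs

# Level 2 (inverse discovery): recover the governing combination by least squares
fitted, *_ = np.linalg.lstsq(Theta, ut, rcond=None)
print(fitted)   # close to [0.1, -1.0]
```

Pinning a known coefficient (e.g. fixing the diffusion entry and fitting only the rest) is the toy analogue of the structural decoupling of known and unknown terms the abstract describes.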
Related papers
- Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge [8.269904705399474]
Recent advances in machine learning have enabled neural operators to serve as powerful surrogates for modeling the evolution of physical systems. We propose a multiphysics training framework that jointly learns from both the original PDEs and their simplified basic forms. Our framework enhances data efficiency, reduces predictive errors, and improves out-of-distribution (OOD) generalization.
arXiv Detail & Related papers (2026-02-16T20:45:10Z) - Information-Theoretic Bounds and Task-Centric Learning Complexity for Real-World Dynamic Nonlinear Systems [0.6875312133832079]
Dynamic nonlinear systems exhibit distortions arising from coupled static and dynamic effects. This paper presents a theoretical framework grounded in structured decomposition, variance analysis, and task-centric complexity bounds.
arXiv Detail & Related papers (2025-09-08T12:08:02Z) - Conservation-informed Graph Learning for Spatiotemporal Dynamics Prediction [84.26340606752763]
In this paper, we introduce the conservation-informed GNN (CiGNN), an end-to-end explainable learning framework. The network is designed to conform to the general conservation law via symmetry, where conservative and non-conservative information passes over a multiscale space by a latent temporal marching strategy. Results demonstrate that CiGNN exhibits remarkable baseline accuracy and generalizability, and is readily applicable to learning for prediction of various spatiotemporal dynamics.
arXiv Detail & Related papers (2024-12-30T13:55:59Z) - From Lazy to Rich: Exact Learning Dynamics in Deep Linear Networks [47.13391046553908]
In artificial networks, the effectiveness of these models relies on their ability to build task-specific representations. Prior studies highlight that different initializations can place networks in either a lazy regime, where representations remain static, or a rich/feature learning regime, where representations evolve dynamically. These solutions capture the evolution of representations and the Neural Tangent Kernel across the spectrum from the rich to the lazy regimes.
arXiv Detail & Related papers (2024-09-22T23:19:04Z) - Data-Driven Computing Methods for Nonlinear Physics Systems with Geometric Constraints [0.7252027234425334]
This paper introduces a novel, data-driven framework that synergizes physics-based priors with advanced machine learning techniques.
Our framework showcases four algorithms, each embedding a specific physics-based prior tailored to a particular class of nonlinear systems.
The integration of these priors also enhances the expressive power of neural networks, enabling them to capture complex patterns typical in physical phenomena.
arXiv Detail & Related papers (2024-06-20T23:10:41Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z) - DySMHO: Data-Driven Discovery of Governing Equations for Dynamical Systems via Moving Horizon Optimization [77.34726150561087]
We introduce Discovery of Dynamical Systems via Moving Horizon Optimization (DySMHO), a scalable machine learning framework.
DySMHO sequentially learns the underlying governing equations from a large dictionary of basis functions.
Canonical nonlinear dynamical system examples are used to demonstrate that DySMHO can accurately recover the governing laws.
arXiv Detail & Related papers (2021-07-30T20:35:03Z) - Hard Encoding of Physics for Learning Spatiotemporal Dynamics [8.546520029145853]
We propose a deep learning architecture that forcibly encodes known physics knowledge to facilitate learning in a data-driven manner.
The coercive encoding mechanism of physics, which is fundamentally different from penalty-based physics-informed learning, ensures that the network rigorously obeys the given physics.
arXiv Detail & Related papers (2021-05-02T21:40:39Z) - Physics-informed learning of governing equations from scarce data [14.95055620484844]
This work introduces a physics-informed deep learning framework to discover governing partial differential equations (PDEs) from scarce and noisy data.
The efficacy and robustness of this method are demonstrated, both numerically and experimentally, on discovering a variety of PDE systems.
The resulting computational framework shows the potential for closed-form model discovery in practical applications.
arXiv Detail & Related papers (2020-05-05T22:13:22Z) - Learning Stable Deep Dynamics Models [91.90131512825504]
We propose an approach for learning dynamical systems that are guaranteed to be stable over the entire state space.
We show that such learning systems are able to model simple dynamical systems and can be combined with additional deep generative models to learn complex dynamics.
arXiv Detail & Related papers (2020-01-17T00:04:45Z)
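Several of the entries above (DySMHO, and the scarce-data discovery framework) recover governing equations from a large dictionary of basis functions via sparse regression. The sketch below shows the shared core idea with sequentially thresholded least squares on a toy scalar system; it is an illustrative SINDy-style stand-in, not the actual DySMHO moving-horizon algorithm, and the function name, dictionary, and dynamics are assumptions:

```python
import numpy as np

def stlsq(Theta, dudt, threshold=0.1, iters=10):
    """Sparse regression: fit, zero out small coefficients, refit on the survivors."""
    xi, *_ = np.linalg.lstsq(Theta, dudt, rcond=None)
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        keep = ~small
        if keep.any():
            # refit only the surviving dictionary columns
            xi[keep], *_ = np.linalg.lstsq(Theta[:, keep], dudt, rcond=None)
    return xi

rng = np.random.default_rng(0)
u = np.linspace(-2.0, 2.0, 200)
# Dictionary of candidate terms: [1, u, u^2, u^3]
Theta = np.stack([np.ones_like(u), u, u**2, u**3], axis=1)
# Noisy observations of the true law du/dt = -2 u + 0.5 u^3
dudt = -2.0 * u + 0.5 * u**3 + 0.01 * rng.standard_normal(u.size)

xi = stlsq(Theta, dudt)
print(xi)   # approximately [0, -2, 0, 0.5]
```

Thresholding prunes the inactive dictionary terms (the constant and quadratic columns here), so the recovered model stays interpretable rather than dense, which is the property these discovery papers emphasize.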
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.