Machine Learning-Based Nonlinear Nudging for Chaotic Dynamical Systems
- URL: http://arxiv.org/abs/2508.05778v1
- Date: Thu, 07 Aug 2025 18:45:43 GMT
- Title: Machine Learning-Based Nonlinear Nudging for Chaotic Dynamical Systems
- Authors: Jaemin Oh, Jinsil Lee, Youngjoon Hong
- Abstract summary: Nudging is an empirical data assimilation technique that incorporates an observation-driven control term into the model dynamics. We propose neural network nudging, a data-driven method for learning nudging terms in nonlinear state space models.
- Score: 2.7855886538423182
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nudging is an empirical data assimilation technique that incorporates an observation-driven control term into the model dynamics. The trajectory of the nudged system approaches the true system trajectory over time, even when the initial conditions differ. For linear state space models, such control terms can be derived under mild assumptions. However, designing effective nudging terms becomes significantly more challenging in the nonlinear setting. In this work, we propose neural network nudging, a data-driven method for learning nudging terms in nonlinear state space models. We establish a theoretical existence result based on the Kazantzis--Kravaris--Luenberger observer theory. The proposed approach is evaluated on three benchmark problems that exhibit chaotic behavior: the Lorenz 96 model, the Kuramoto--Sivashinsky equation, and the Kolmogorov flow.
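The abstract describes classical nudging: the model tendency is augmented with a control term driven by the mismatch between observations and the model state, so that the nudged trajectory synchronizes with the truth even from a different initial condition. The sketch below illustrates this for the Lorenz 96 benchmark with a hand-tuned scalar gain `mu` and full-state observations; both are illustrative assumptions, not the paper's learned neural nudging term, which replaces this linear control with a trained network.

```python
# Classical linear nudging on the Lorenz 96 model: a minimal sketch.
# Assumptions (not from the paper): full-state observations y = x and
# a hand-tuned scalar nudging gain mu.
import numpy as np

def lorenz96(x, F=8.0):
    """Lorenz 96 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(z, dt, f):
    """One fourth-order Runge-Kutta step of dz/dt = f(z)."""
    k1 = f(z)
    k2 = f(z + 0.5 * dt * k1)
    k3 = f(z + 0.5 * dt * k2)
    k4 = f(z + dt * k3)
    return z + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

n, dt, mu = 40, 0.01, 10.0
rng = np.random.default_rng(0)

def coupled(z):
    """Truth and nudged state integrated jointly:
       truth:  dx/dt  = f(x)
       nudged: dxh/dt = f(xh) + mu * (y - xh), with observation y = x.
    """
    x, xh = z[:n], z[n:]
    return np.concatenate([lorenz96(x), lorenz96(xh) + mu * (x - xh)])

# Different initial conditions for the truth and the nudged model.
z = np.concatenate([8.0 + rng.standard_normal(n),
                    8.0 + rng.standard_normal(n)])
for _ in range(2000):  # integrate to t = 20
    z = rk4_step(z, dt, coupled)

x_true, x_nudged = z[:n], z[n:]
# Despite the differing initial conditions, the nudged trajectory
# synchronizes with the truth; the residual error is near machine precision.
print(np.max(np.abs(x_true - x_nudged)))
```

With a large enough gain, the observation-driven term dominates the chaotic error growth and the two trajectories converge; designing such a term when observations are partial or the dynamics are strongly nonlinear is the difficulty the paper's neural network nudging addresses.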
Related papers
- Learning Physically Consistent Lagrangian Control Models Without Acceleration Measurements [11.581126685402083]
This article focuses on the derivation and identification of physically consistent models, which are essential for model-based control synthesis. Lagrangian or Hamiltonian neural networks provide useful structural guarantees, but training them often yields inconsistent models. A learning algorithm relying on an original loss function is proposed to improve the physical consistency of Lagrangian systems.
arXiv Detail & Related papers (2025-12-02T18:56:02Z) - PINN-Obs: Physics-Informed Neural Network-Based Observer for Nonlinear Dynamical Systems [2.884893167166808]
This paper introduces a novel Adaptive Physics-Informed Neural Network-based Observer (PINN-Obs) for accurate state estimation in nonlinear systems. Unlike traditional model-based observers, which require explicit system transformations or linearization, the proposed framework directly integrates system dynamics and sensor data into a physics-informed learning process.
arXiv Detail & Related papers (2025-07-09T10:09:45Z) - Nonlinear Model Order Reduction of Dynamical Systems in Process Engineering: Review and Comparison [50.0791489606211]
We review state-of-the-art nonlinear model order reduction methods. We discuss both general-purpose methods and tailored approaches for (chemical) process systems.
arXiv Detail & Related papers (2025-06-15T11:39:12Z) - Certified Neural Approximations of Nonlinear Dynamics [52.79163248326912]
In safety-critical contexts, the use of neural approximations requires formal bounds on their closeness to the underlying system. We propose a novel, adaptive, and parallelizable verification method based on certified first-order models.
arXiv Detail & Related papers (2025-05-21T13:22:20Z) - Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - Neural Abstractions [72.42530499990028]
We present a novel method for the safety verification of nonlinear dynamical models that uses neural networks to represent abstractions of their dynamics.
We demonstrate that our approach performs comparably to the mature tool Flow* on existing benchmark nonlinear models.
arXiv Detail & Related papers (2023-01-27T12:38:09Z) - Neural ODEs as Feedback Policies for Nonlinear Optimal Control [1.8514606155611764]
We use neural ordinary differential equations (Neural ODEs) to model continuous-time dynamics as differential equations parametrized with neural networks.
We propose the use of a neural control policy posed as a Neural ODE to solve general nonlinear optimal control problems.
arXiv Detail & Related papers (2022-10-20T13:19:26Z) - Log-linear Guardedness and its Implications [116.87322784046926]
Linearity-assuming methods for erasing human-interpretable concepts from neural representations have been found to be tractable and useful.
This work formally defines the notion of log-linear guardedness as the inability of an adversary to predict the concept directly from the representation.
We show that, in the binary case, under certain assumptions, a downstream log-linear model cannot recover the erased concept.
arXiv Detail & Related papers (2022-10-18T17:30:02Z) - Learning Reduced Nonlinear State-Space Models: an Output-Error Based Canonical Approach [8.029702645528412]
We investigate the effectiveness of deep learning in the modeling of dynamic systems with nonlinear behavior.
We show its ability to identify three different nonlinear systems.
Performance is evaluated in terms of open-loop prediction on test data generated in simulation, as well as on a real-world dataset of unmanned aerial vehicle flight measurements.
arXiv Detail & Related papers (2022-04-19T06:33:23Z) - Likelihood-Free Inference in State-Space Models with Unknown Dynamics [71.94716503075645]
We introduce a method for inferring and predicting latent states in state-space models where observations can only be simulated, and transition dynamics are unknown.
We propose a way of doing likelihood-free inference (LFI) of states and state prediction with a limited number of simulations.
arXiv Detail & Related papers (2021-11-02T12:33:42Z) - Constrained Block Nonlinear Neural Dynamical Models [1.3163098563588727]
Neural network modules conditioned by known priors can be effectively trained and combined to represent systems with nonlinear dynamics.
The proposed method consists of neural network blocks that represent input, state, and output dynamics with constraints placed on the network weights and system variables.
We evaluate the performance of the proposed architecture and training methods on system identification tasks for three nonlinear systems.
arXiv Detail & Related papers (2021-01-06T04:27:54Z) - Constrained Neural Ordinary Differential Equations with Stability Guarantees [1.1086440815804224]
We show how to model discrete ordinary differential equations with algebraic nonlinearities as deep neural networks.
We derive stability guarantees for the network layers based on the implicit constraints imposed on the eigenvalues of the weight matrices.
We demonstrate the prediction accuracy of learned neural ODEs evaluated on open-loop simulations.
arXiv Detail & Related papers (2020-04-22T22:07:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.