Delay compensation of multi-input distinct delay nonlinear systems via neural operators
- URL: http://arxiv.org/abs/2509.17131v1
- Date: Sun, 21 Sep 2025 15:46:46 GMT
- Title: Delay compensation of multi-input distinct delay nonlinear systems via neural operators
- Authors: Filip Bajraktari, Luke Bhan, Miroslav Krstic, Yuanyuan Shi
- Abstract summary: We show that if the predictor approximation satisfies a uniform (in time) error bound, semi-global practical stability is correspondingly achieved. For such approximators, the required uniform error bound depends on the desired region of attraction and the number of control inputs in the system.
- Score: 5.578049844940438
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, we present the first stability results for approximate predictors in multi-input non-linear systems with distinct actuation delays. We show that if the predictor approximation satisfies a uniform (in time) error bound, semi-global practical stability is correspondingly achieved. For such approximators, the required uniform error bound depends on the desired region of attraction and the number of control inputs in the system. The result is achieved through transforming the delay into a transport PDE and conducting analysis on the coupled ODE-PDE cascade. To highlight the viability of such error bounds, we demonstrate our results on a class of approximators - neural operators - showcasing sufficiency for satisfying such a universal bound both theoretically and in simulation on a mobile robot experiment.
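To make the predictor-feedback idea concrete, here is a minimal numerical sketch for a scalar system with a single known input delay, where the predictor is obtained by forward-integrating the plant over the delay window. The drift f, the nominal feedback kappa, and all constants are illustrative, and this exact integration merely stands in for the neural-operator predictor approximation that the paper analyzes.

```python
# Minimal sketch (not the paper's method): predictor feedback for a scalar
# plant  x'(t) = f(x(t)) + u(t - D)  with a single known actuator delay D.
# The predictor P(t) ~= x(t + D) is computed by integrating the plant forward
# over the delay window using the already-issued inputs; the paper replaces
# this integration with a neural-operator approximation and quantifies the
# effect of its uniform error bound. All names and constants are illustrative.
import numpy as np

D, dt, T = 1.0, 0.01, 10.0              # delay, step size, horizon
n_delay = int(D / dt)                   # number of buffered input samples

f = lambda x: x + x**3                  # open-loop unstable drift (example)
kappa = lambda x: -x - x**3 - 2.0 * x   # nominal stabilizing feedback (example)

x = 0.2                                 # initial state
u_buf = np.zeros(n_delay)               # stores u(t-D), ..., u(t-dt)

for _ in range(int(T / dt)):
    # Predictor: integrate the plant forward over [t, t+D] using the inputs
    # already in the buffer (oldest first).
    p = x
    for u_past in u_buf:
        p = p + dt * (f(p) + u_past)
    u_new = kappa(p)                    # feedback evaluated at the predicted state
    # Plant step: the input acting now was issued D seconds ago.
    x = x + dt * (f(x) + u_buf[0])
    u_buf = np.append(u_buf[1:], u_new)

print(f"|x(T)| = {abs(x):.3e}")         # small if predictor feedback stabilizes
```

In the multi-input setting studied in the paper, each input channel carries its own delay and buffer, and the stability guarantee is stated in terms of a uniform bound on the predictor approximation error over the desired region of attraction.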
Related papers
- The Best of Both Worlds: Hybridizing Neural Operators and Solvers for Stable Long-Horizon Inference [0.0]
ANCHOR is an online, instance-aware hybrid inference framework for stable long-horizon prediction of PDEs. We show that ANCHOR reliably bounds long-horizon error growth, stabilizes extrapolative rollouts, and significantly improves robustness over standalone neural operators.
arXiv Detail & Related papers (2025-12-22T18:17:28Z) - Generative Modeling with Continuous Flows: Sample Complexity of Flow Matching [60.37045080890305]
We provide the first analysis of the sample complexity of flow-matching-based generative models. We decompose the velocity field estimation error into neural-network approximation error, statistical error due to the finite sample size, and optimization error due to the finite number of optimization steps for estimating the velocity field.
arXiv Detail & Related papers (2025-12-01T05:14:25Z) - Revisiting Zeroth-Order Optimization: Minimum-Variance Two-Point Estimators and Directionally Aligned Perturbations [57.179679246370114]
We identify the distribution of random perturbations that minimizes the estimator's variance as the perturbation stepsize tends to zero. Our findings reveal that such desired perturbations can align directionally with the true gradient, instead of maintaining a fixed length.
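As background for the estimator named in the title, here is a minimal sketch of the classical two-point zeroth-order gradient estimator with Gaussian perturbations; the test function, sample count, and the Gaussian choice are illustrative, and the sketch does not implement the paper's variance-optimal perturbation distribution.

```python
# Minimal sketch of the standard two-point zeroth-order estimator:
#   g_hat = (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u,  u a random direction.
# The paper studies which distribution of u minimizes the variance as mu -> 0;
# here u is simply drawn from a unit Gaussian for illustration.
import numpy as np

def two_point_grad(f, x, mu=1e-4, n_samples=64, rng=np.random.default_rng(0)):
    """Average of n_samples two-point finite-difference gradient estimates."""
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.size)              # random perturbation direction
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n_samples

# Example: f(x) = ||x||^2 / 2, whose true gradient is x itself.
f = lambda x: 0.5 * np.dot(x, x)
x = np.array([1.0, -2.0, 0.5])
print(two_point_grad(f, x))   # roughly [1.0, -2.0, 0.5], up to estimation noise
```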
arXiv Detail & Related papers (2025-10-22T19:06:39Z) - Stabilization of nonlinear systems with unknown delays via delay-adaptive neural operator approximate predictors [6.093618731228799]
This work establishes the first rigorous stability guarantees for approximate predictors in delay-adaptive control of nonlinear systems. We show that neural operators, a flexible class of neural-network-based approximators, can achieve arbitrarily small approximation errors.
arXiv Detail & Related papers (2025-09-30T16:00:58Z) - Delay-adaptive Control of Nonlinear Systems with Approximate Neural Operator Predictors [6.093618731228799]
We propose a rigorous method for implementing predictor feedback controllers in nonlinear systems with unknown and arbitrarily long actuator delays. To address the analytically intractable nature of the predictor, we approximate it using a learned neural operator mapping. We provide a theoretical stability analysis based on the universal approximation theorem of neural operators and the transport partial differential equation (PDE) representation of the delay. We then prove, via a Lyapunov-Krasovskii functional, semi-global practical convergence of the dynamical system dependent on the approximation error of the predictor and delay bounds.
arXiv Detail & Related papers (2025-08-28T02:30:53Z) - MultiPDENet: PDE-embedded Learning with Multi-time-stepping for Accelerated Flow Simulation [48.41289705783405]
We propose a PDE-embedded network with multiscale time stepping (MultiPDENet). In particular, we design a convolutional filter based on the finite-difference structure, with a small number of parameters to optimize. A Physics Block with a 4th-order Runge-Kutta integrator at the fine time scale embeds the structure of the PDEs to guide the prediction.
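As a rough illustration of the two ingredients named above, a finite-difference stencil used as a convolutional filter and a 4th-order Runge-Kutta integrator at the fine time scale, here is a minimal hand-written sketch on the 1D heat equation; it is not MultiPDENet, and the equation, grid, and parameters are placeholders.

```python
# Minimal sketch: (i) a convolutional filter whose weights form a finite-
# difference stencil and (ii) a classic 4th-order Runge-Kutta step at the fine
# time scale, applied to the 1D heat equation u_t = nu * u_xx (illustrative).
import numpy as np

nx, dx, nu, dt = 128, 1.0 / 128, 0.01, 1e-4
stencil = np.array([1.0, -2.0, 1.0]) / dx**2     # second-derivative FD "filter"

def rhs(u):
    """Periodic convolution with the FD stencil approximates nu * u_xx."""
    u_pad = np.concatenate([u[-1:], u, u[:1]])   # periodic padding
    return nu * np.convolve(u_pad, stencil, mode="valid")

def rk4_step(u, dt):
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.sin(2 * np.pi * x)                        # initial condition
for _ in range(1000):                            # fine-scale rollout
    u = rk4_step(u, dt)
print(f"amplitude after rollout: {np.abs(u).max():.4f}")
```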
arXiv Detail & Related papers (2025-01-27T12:15:51Z) - Neural Operators for Predictor Feedback Control of Nonlinear Delay Systems [3.0248879829045388]
We recast the predictor design as an operator learning problem, and learn the predictor mapping via a neural operator. Under the approximated predictor, we achieve semiglobal practical stability of the closed-loop nonlinear delay system. We demonstrate the approach by controlling a 5-link robotic manipulator with different neural operator models.
arXiv Detail & Related papers (2024-11-28T07:30:26Z) - Adaptive control of reaction-diffusion PDEs via neural operator-approximated gain kernels [3.3044728148521623]
Neural operator approximations of the gain kernels in PDE backstepping have emerged as a viable method for implementing controllers in real time. We extend the neural operator methodology from adaptive control of a hyperbolic PDE to adaptive control of a benchmark parabolic PDE. We prove global stability and regulation of the plant state for a Lyapunov design of parameter adaptation.
arXiv Detail & Related papers (2024-07-01T19:24:36Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the time points of supervision, and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
The NDO is pre-trained on a class of symbolic functions, and it learns the mapping from the trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can closely approximate the ground-truth derivatives by properly tuning the complexity of the library.
arXiv Detail & Related papers (2021-06-08T08:04:47Z) - Efficient Semi-Implicit Variational Inference [65.07058307271329]
We propose an efficient and scalable semi-implicit variational inference (SIVI) method.
Our method optimizes a rigorous lower bound on SIVI's evidence with low-variance gradient estimates.
arXiv Detail & Related papers (2021-01-15T11:39:09Z) - Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.