HOUND: High-Order Universal Numerical Differentiator for a Parameter-free Polynomial Online Approximation
- URL: http://arxiv.org/abs/2411.00794v1
- Date: Fri, 18 Oct 2024 13:42:01 GMT
- Title: HOUND: High-Order Universal Numerical Differentiator for a Parameter-free Polynomial Online Approximation
- Authors: Igor Katrichek
- Abstract summary: This paper introduces a numerical differentiator, represented as a system of nonlinear differential equations of any high order.
We demonstrate that, with a suitable choice of differentiator order, the error converges to zero for signals with additive white noise.
A notable advantage of this numerical differentiation is that it does not require tuning parameters based on the specific characteristics of the signal being differentiated.
- Abstract: This paper introduces a scalar numerical differentiator, represented as a system of nonlinear differential equations of any high order. We derive the explicit solution for this system and demonstrate that, with a suitable choice of differentiator order, the error converges to zero for polynomial signals with additive white noise. In more general cases, the error remains bounded, provided that the highest estimated derivative is also bounded. A notable advantage of this numerical differentiation method is that it does not require tuning parameters based on the specific characteristics of the signal being differentiated. We propose a discretization method for the equations that implements a cumulative smoothing algorithm for time series. This algorithm operates online, without the need for data accumulation, and it solves both interpolation and extrapolation problems without fitting any coefficients to the data.
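The abstract does not reproduce the differentiator's governing equations, so the sketch below is not HOUND itself. It is a conventional sliding-window polynomial least-squares differentiator, included only to illustrate the general setting of online derivative estimation from a local polynomial model; the window length and polynomial order here are illustrative assumptions, and unlike HOUND this baseline does fit coefficients to the data at each step.

```python
import math
import numpy as np

def online_poly_derivatives(stream, dt, order=2, window=15):
    """Estimate derivatives 0..order of a uniformly sampled signal online,
    by least-squares fitting a degree-`order` polynomial over a sliding
    window of the most recent samples (no global data accumulation)."""
    buf = []
    out = []
    for y in stream:
        buf.append(float(y))
        if len(buf) > window:
            buf.pop(0)
        n = len(buf)
        deg = min(order, n - 1)
        # Local time grid ending at 0, so the fit's Taylor coefficients
        # at t=0 are the derivative estimates at the newest sample.
        t = (np.arange(n) - (n - 1)) * dt
        c = np.polyfit(t, buf, deg)[::-1]   # c[k] = f^(k)(0) / k!
        out.append([c[k] * math.factorial(k) if k < len(c) else 0.0
                    for k in range(order + 1)])
    return out
```

On a noise-free quadratic signal this baseline recovers the signal value and its first two derivatives exactly (up to floating-point error); the paper's contribution is achieving comparable behavior without tuning such window/order parameters to the signal.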
Related papers
- Physics-informed AI and ML-based sparse system identification algorithm for discovery of PDE's representing nonlinear dynamic systems [0.0]
The proposed method is demonstrated to discover various differential equations at various noise levels, including three-dimensional, fourth-order, and stiff equations.
The parameter estimation converges accurately to the true values with a small coefficient of variation, suggesting robustness to the noise.
arXiv Detail & Related papers (2024-10-13T21:48:51Z)
- Constrained Optimization via Exact Augmented Lagrangian and Randomized Iterative Sketching [55.28394191394675]
We develop an adaptive inexact Newton method for equality-constrained nonlinear, nonconvex optimization problems.
We demonstrate the superior performance of our method on benchmark nonlinear problems, constrained logistic regression with data from LIBSVM, and a PDE-constrained problem.
arXiv Detail & Related papers (2023-05-28T06:33:37Z)
- WeakIdent: Weak formulation for Identifying Differential Equations using Narrow-fit and Trimming [5.027714423258538]
We propose a general and robust framework to recover differential equations using a weak formulation.
For each sparsity level, Subspace Pursuit is utilized to find an initial set of support from the large dictionary.
The proposed method gives a robust recovery of the coefficients and a significant denoising effect, handling up to $100\%$ noise-to-signal ratio.
arXiv Detail & Related papers (2022-11-06T14:33:22Z)
- Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z)
- Decoupling multivariate functions using a nonparametric filtered tensor decomposition [0.29360071145551075]
Decoupling techniques aim at providing an alternative representation of the nonlinearity.
The so-called decoupled form is often a more efficient parameterisation of the relationship while being highly structured, favouring interpretability.
In this work two new algorithms, based on filtered tensor decompositions of first order derivative information are introduced.
arXiv Detail & Related papers (2022-05-23T09:34:17Z)
- Automated differential equation solver based on the parametric approximation optimization [77.34726150561087]
The article presents a method that uses an optimization algorithm to obtain a solution via a parameterized approximation.
It allows solving a wide class of equations in an automated manner without changing the algorithm's parameters.
arXiv Detail & Related papers (2022-05-11T10:06:47Z)
- Online Weak-form Sparse Identification of Partial Differential Equations [0.5156484100374058]
This paper presents an online algorithm for identification of partial differential equations (PDEs) based on the weak-form sparse identification of nonlinear dynamics algorithm (WSINDy).
The core of the method combines a weak-form discretization of candidate PDEs with an online proximal gradient descent approach to the sparse regression problem.
arXiv Detail & Related papers (2022-03-08T10:11:09Z)
- Numerical Solution of Stiff Ordinary Differential Equations with Random Projection Neural Networks [0.0]
We propose a numerical scheme based on Random Projection Neural Networks (RPNN) for the solution of Ordinary Differential Equations (ODEs).
We show that our proposed scheme yields good numerical approximation accuracy without being affected by the stiffness, in some cases outperforming the ode45 and ode15s functions.
arXiv Detail & Related papers (2021-08-03T15:49:17Z)
- Learning Linearized Assignment Flows for Image Labeling [70.540936204654]
We introduce a novel algorithm for estimating optimal parameters of linearized assignment flows for image labeling.
We show how to efficiently evaluate this formula using a Krylov subspace and a low-rank approximation.
arXiv Detail & Related papers (2021-08-02T13:38:09Z)
- Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
arXiv Detail & Related papers (2020-12-09T20:19:32Z)
- Implicit differentiation of Lasso-type models for hyperparameter optimization [82.73138686390514]
We introduce an efficient implicit differentiation algorithm, without matrix inversion, tailored for Lasso-type problems.
Our approach scales to high-dimensional data by leveraging the sparsity of the solutions.
arXiv Detail & Related papers (2020-02-20T18:43:42Z)
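Several entries above (WSINDy's online proximal gradient step, the Lasso-type implicit-differentiation paper) build on the same primitive: proximal gradient descent with soft-thresholding. The following is a minimal textbook ISTA sketch of that primitive, not the implementation from any of the listed papers; the step size and iteration count are illustrative defaults.

```python
import numpy as np

def ista_lasso(X, y, lam, step=None, iters=500):
    """Solve min_w 0.5*||Xw - y||^2 + lam*||w||_1 via ISTA:
    a gradient step on the smooth term followed by soft-thresholding
    (the proximal operator of the l1 penalty)."""
    n, d = X.shape
    if step is None:
        # 1 / Lipschitz constant of the gradient: spectral norm of X, squared
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y)
        z = w - step * grad
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return w
```

With an orthonormal design (e.g. the identity matrix) the solution reduces to component-wise soft-thresholding of y, which gives a quick sanity check of the implementation.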
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.