Towards stable real-world equation discovery with assessing differentiating quality influence
- URL: http://arxiv.org/abs/2311.05787v1
- Date: Thu, 9 Nov 2023 23:32:06 GMT
- Title: Towards stable real-world equation discovery with assessing differentiating quality influence
- Authors: Mikhail Masliaev, Ilya Markov, Alexander Hvatov
- Abstract summary: We propose alternatives to the commonly used finite differences-based method.
We evaluate these methods in terms of their applicability to problems similar to real-world ones and their ability to ensure the convergence of equation discovery algorithms.
- Score: 52.2980614912553
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper explores the critical role of differentiation approaches for
data-driven differential equation discovery. Accurate derivatives of the input
data are essential for reliable algorithmic operation, particularly in
real-world scenarios where measurement quality is inevitably compromised. We
propose alternatives to the commonly used finite differences-based method,
notorious for its instability in the presence of noise, which can exacerbate
random errors in the data. Our analysis covers four distinct methods:
Savitzky-Golay filtering, spectral differentiation, smoothing based on
artificial neural networks, and the regularization of derivative variation. We
evaluate these methods in terms of their applicability to problems similar to
real-world ones and their ability to ensure the convergence of equation
discovery algorithms, providing valuable insights for robust modeling of
real-world processes.
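The contrast the abstract draws can be sketched numerically. Below is a minimal illustration (not the authors' code) comparing plain finite differences with two of the four alternatives named above, Savitzky-Golay filtering and spectral differentiation, on noisy samples of a known function; the test function, noise level, and filter parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
n = 512
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x) + 0.01 * rng.standard_normal(n)  # noisy samples of sin(x)
true_du = np.cos(x)                             # exact derivative

# 1) Finite differences: random errors are amplified by ~1/dx
fd = np.gradient(u, dx)

# 2) Savitzky-Golay: fit a local polynomial, differentiate the fit
sg = savgol_filter(u, window_length=31, polyorder=3, deriv=1, delta=dx)

# 3) Spectral: multiply by ik in Fourier space (assumes periodic data;
#    high-frequency noise is amplified unless the spectrum is truncated)
k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
spec = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

for name, est in [("finite diff", fd), ("savgol", sg), ("spectral", spec)]:
    rmse = np.sqrt(np.mean((est - true_du) ** 2))
    print(f"{name:12s} RMSE = {rmse:.4f}")
```

On this kind of smooth signal, the smoothing-based estimate typically recovers the derivative far more accurately than raw finite differences, which is the instability the paper's comparison targets.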
Related papers
- Robust identifiability for symbolic recovery of differential equations [14.08907045605149]
This paper investigates how noise influences the uniqueness and identifiability of physical laws governed by partial differential equations (PDEs).
We introduce new algorithms that account for noise, providing thresholds to assess uniqueness and identifying situations where excessive noise hinders reliable conclusions.
arXiv Detail & Related papers (2024-10-13T17:45:14Z) - Sparse discovery of differential equations based on multi-fidelity Gaussian process [0.8088384541966945]
Sparse identification of differential equations aims to compute the analytic expressions from the observed data explicitly.
It exhibits sensitivity to the noise in the observed data, particularly for the derivatives computations.
Existing literature predominantly concentrates on single-fidelity (SF) data, which imposes limitations on its applicability.
We present two novel approaches to address these problems from the view of uncertainty quantification.
arXiv Detail & Related papers (2024-01-22T10:38:14Z) - Learning Correspondence Uncertainty via Differentiable Nonlinear Least Squares [47.83169780113135]
We propose a differentiable nonlinear least squares framework to account for uncertainty in relative pose estimation from feature correspondences.
We evaluate our approach on synthetic, as well as the KITTI and EuRoC real-world datasets.
arXiv Detail & Related papers (2023-05-16T15:21:09Z) - Nonparametric learning of kernels in nonlocal operators [6.314604944530131]
We provide a rigorous identifiability analysis and convergence study for the learning of kernels in nonlocal operators.
We propose a nonparametric regression algorithm with a novel data adaptive RKHS Tikhonov regularization method based on the function space of identifiability.
arXiv Detail & Related papers (2022-05-23T02:47:55Z) - Distributionally Robust Learning with Stable Adversarial Training [34.74504615726101]
Machine learning algorithms with empirical risk minimization are vulnerable under distributional shifts.
We propose a novel Stable Adversarial Learning (SAL) algorithm that leverages heterogeneous data sources to construct a more practical uncertainty set.
arXiv Detail & Related papers (2021-06-30T03:05:45Z) - BayesIMP: Uncertainty Quantification for Causal Data Fusion [52.184885680729224]
We study the causal data fusion problem, where datasets pertaining to multiple causal graphs are combined to estimate the average treatment effect of a target variable.
We introduce a framework which combines ideas from probabilistic integration and kernel mean embeddings to represent interventional distributions in the reproducing kernel Hilbert space.
arXiv Detail & Related papers (2021-06-07T10:14:18Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Accounting for Unobserved Confounding in Domain Generalization [107.0464488046289]
This paper investigates the problem of learning robust, generalizable prediction models from a combination of datasets.
Part of the challenge of learning robust models lies in the influence of unobserved confounders.
We demonstrate the empirical performance of our approach on healthcare data from different modalities.
arXiv Detail & Related papers (2020-07-21T08:18:06Z) - Learning while Respecting Privacy and Robustness to Distributional Uncertainties and Adversarial Data [66.78671826743884]
The distributionally robust optimization framework is considered for training a parametric model.
The objective is to endow the trained model with robustness against adversarially manipulated input data.
Proposed algorithms offer robustness with little overhead.
arXiv Detail & Related papers (2020-07-07T18:25:25Z)