Weak SINDy For Partial Differential Equations
- URL: http://arxiv.org/abs/2007.02848v3
- Date: Mon, 21 Dec 2020 17:26:50 GMT
- Title: Weak SINDy For Partial Differential Equations
- Authors: Daniel A. Messenger and David M. Bortz
- Abstract summary: We extend our Weak SINDy (WSINDy) framework to the setting of partial differential equations (PDEs).
The elimination of pointwise derivative approximations via the weak form enables effective machine-precision recovery of model coefficients from noise-free data.
We demonstrate WSINDy's robustness, speed and accuracy on several challenging PDEs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sparse Identification of Nonlinear Dynamics (SINDy) is a method of system
discovery that has been shown to successfully recover governing dynamical
systems from data (Brunton et al., PNAS, '16; Rudy et al., Sci. Adv. '17).
Recently, several groups have independently discovered that the weak
formulation provides orders of magnitude better robustness to noise. Here we
extend our Weak SINDy (WSINDy) framework introduced in (arXiv:2005.04339) to
the setting of partial differential equations (PDEs). The elimination of
pointwise derivative approximations via the weak form enables effective
machine-precision recovery of model coefficients from noise-free data (i.e.
below the tolerance of the simulation scheme) as well as robust identification
of PDEs in the large noise regime (with signal-to-noise ratio approaching one
in many well-known cases). This is accomplished by discretizing a convolutional
weak form of the PDE and exploiting separability of test functions for
efficient model identification using the Fast Fourier Transform. The resulting
WSINDy algorithm for PDEs has a worst-case computational complexity of
$\mathcal{O}(N^{D+1}\log(N))$ for datasets with $N$ points in each of $D+1$
dimensions (i.e. $\mathcal{O}(\log(N))$ operations per datapoint). Furthermore,
our Fourier-based implementation reveals a connection between robustness to
noise and the spectra of test functions, which we utilize in an \textit{a
priori} selection algorithm for test functions. Finally, we introduce a
learning algorithm for the threshold in sequential-thresholding least-squares
(STLS) that enables model identification from large libraries, and we utilize
scale-invariance at the continuum level to identify PDEs from poorly-scaled
datasets. We demonstrate WSINDy's robustness, speed and accuracy on several
challenging PDEs.
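A minimal sketch may help make the abstract's computational pipeline concrete. For a candidate term $\partial_x^{\alpha} f_j(u)$, multiplying by a compactly supported test function $\psi$ and integrating by parts gives $\iint \psi\,\partial_x^{\alpha} f_j(u)\,dx\,dt = (-1)^{\alpha}\iint \partial_x^{\alpha}\psi\, f_j(u)\,dx\,dt$, so only analytic derivatives of $\psi$ appear and no derivatives of the noisy data are required; evaluating these integrals over a grid of translated test functions is a discrete convolution, which is where the FFT enters. The code below is an illustrative sketch only (not the authors' implementation): the helper names, the polynomial bump test function, the two-term Burgers-type library, and the fixed STLS threshold are all assumptions made for brevity, and the example is restricted to one spatial dimension.

```python
import numpy as np
from scipy.signal import fftconvolve

def test_function(m, h, p):
    """Compactly supported bump psi(x) = (1 - (x/(m*h))^2)^p on the grid
    x = -m*h, ..., m*h, with its first two derivatives computed analytically
    (so the data itself is never differentiated)."""
    r = m * h
    s = np.arange(-m, m + 1) * h / r
    psi = (1 - s**2) ** p
    dpsi = -2 * p * s * (1 - s**2) ** (p - 1) / r
    ddpsi = (-2 * p * (1 - s**2) ** (p - 1)
             + 4 * p * (p - 1) * s**2 * (1 - s**2) ** (p - 2)) / r**2
    return psi, dpsi, ddpsi

def weak_features(u, dx, dt, mx=20, mt=20, p=8):
    """Assemble the weak-form linear system b ~ G @ w for the model
    u_t = w1 * (u^2)_x + w2 * u_xx (a Burgers-type library, chosen here
    purely for illustration). u has shape (n_time, n_space)."""
    px, dpx, ddpx = test_function(mx, dx, p)   # spatial factor of psi
    pt, dpt, _    = test_function(mt, dt, p)   # temporal factor of psi

    def weak_integral(data, kx, kt):
        # Reflected kernels turn fftconvolve into the cross-correlation
        # sum_i kx(x_i - x_k) * kt(t_i - t_k) * data(t_i, x_i) * dx * dt,
        # i.e. the weak-form integral against every translated test function.
        out = fftconvolve(data, kx[::-1][None, :], mode='valid')
        out = fftconvolve(out, kt[::-1][:, None], mode='valid')
        return out * dx * dt

    b = -weak_integral(u, px, dpt).ravel()          # from int psi u_t = -int psi_t u
    G = np.column_stack([
        -weak_integral(u**2, dpx, pt).ravel(),      # (u^2)_x: one integration by parts
         weak_integral(u, ddpx, pt).ravel(),        # u_xx:    two integrations by parts
    ])
    return G, b

def stls(G, b, threshold=0.1, iters=10):
    """Plain sequential-thresholding least squares with a fixed threshold;
    the paper additionally learns this threshold from the data."""
    w = np.linalg.lstsq(G, b, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(w) < threshold
        w[small] = 0.0
        if (~small).any():
            w[~small] = np.linalg.lstsq(G[:, ~small], b, rcond=None)[0]
    return w
```

On Burgers data $u_t = -\tfrac{1}{2}(u^2)_x + \nu u_{xx}$ sampled on a uniform grid, `stls(*weak_features(u, dx, dt))` would ideally return coefficients near $(-0.5, \nu)$; the full WSINDy algorithm differs in using a much larger library in $D$ spatial dimensions, selecting the test-function parameters \textit{a priori} from the data's spectrum, and learning the STLS threshold rather than fixing it.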
Related papers
- Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
arXiv Detail & Related papers (2024-11-21T10:26:17Z) - Physics-informed AI and ML-based sparse system identification algorithm for discovery of PDE's representing nonlinear dynamic systems [0.0]
The proposed method is demonstrated to discover various differential equations at various noise levels, including three-dimensional, fourth-order, and stiff equations.
The parameter estimation converges accurately to the true values with a small coefficient of variation, suggesting robustness to noise.
arXiv Detail & Related papers (2024-10-13T21:48:51Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Automating the Discovery of Partial Differential Equations in Dynamical Systems [0.0]
We present an extension to the ARGOS framework, ARGOS-RAL, which leverages sparse regression with the recurrent adaptive lasso to identify PDEs automatically.
We rigorously evaluate the performance of ARGOS-RAL in identifying canonical PDEs under various noise levels and sample sizes.
Our results show that ARGOS-RAL effectively and reliably identifies the underlying PDEs from data, outperforming the sequential threshold ridge regression method in most cases.
arXiv Detail & Related papers (2024-04-25T09:23:03Z) - Weak Collocation Regression for Inferring Stochastic Dynamics with
Lévy Noise [8.15076267771005]
We propose a weak form of the Fokker-Planck (FP) equation for extracting dynamics with Lévy noise.
Our approach can simultaneously distinguish mixed noise types, even in multi-dimensional problems.
arXiv Detail & Related papers (2024-03-13T06:54:38Z) - Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels [57.46832672991433]
We propose a novel equation discovery method based on Kernel learning and BAyesian Spike-and-Slab priors (KBASS).
We use kernel regression to estimate the target function, which is flexible, expressive, and more robust to data sparsity and noise.
We develop an expectation-propagation expectation-maximization algorithm for efficient posterior inference and function estimation.
arXiv Detail & Related papers (2023-10-09T03:55:09Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Scaling Structured Inference with Randomization [64.18063627155128]
We propose a family of randomized dynamic programming (RDP) algorithms for scaling structured models to tens of thousands of latent states.
Our method is widely applicable to classical DP-based inference.
It is also compatible with automatic differentiation, so it can be integrated with neural networks seamlessly.
arXiv Detail & Related papers (2021-12-07T11:26:41Z) - Deep-learning based discovery of partial differential equations in
integral form from sparse and noisy data [2.745859263816099]
A new framework combining deep learning and the integral form is proposed to handle the problems of sparse and noisy data simultaneously.
Our proposed algorithm is more robust to noise and more accurate compared with existing methods due to the utilization of integral form.
arXiv Detail & Related papers (2020-11-24T09:18:39Z) - Probabilistic Circuits for Variational Inference in Discrete Graphical
Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z) - Deep-learning of Parametric Partial Differential Equations from Sparse
and Noisy Data [2.4431531175170362]
In this work, a new framework, which combines a neural network, a genetic algorithm, and adaptive methods, is put forward to address all of these challenges simultaneously.
A trained neural network is utilized to calculate derivatives and generate a large amount of meta-data, which solves the problem of sparse noisy data.
Next, a genetic algorithm is utilized to discover the form of PDEs and corresponding coefficients with an incomplete candidate library.
A two-step adaptive method is introduced to discover parametric PDEs with spatially- or temporally-varying coefficients.
arXiv Detail & Related papers (2020-05-16T09:09:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.