Weak Collocation Regression method: fast reveal hidden stochastic
dynamics from high-dimensional aggregate data
- URL: http://arxiv.org/abs/2209.02628v3
- Date: Thu, 1 Feb 2024 12:00:48 GMT
- Title: Weak Collocation Regression method: fast reveal hidden stochastic
dynamics from high-dimensional aggregate data
- Authors: Liwei Lu, Zhijun Zeng, Yan Jiang, Yi Zhu, and Pipi Hu
- Abstract summary: We present an approach to effectively model the dynamics of data without trajectories, based on the weak form of the Fokker-Planck (FP) equation.
We name the method the Weak Collocation Regression (WCR) method for its three key components: weak form, collocation of Gaussian kernels, and regression.
- Score: 10.195461345970541
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Revealing hidden dynamics from stochastic data is a challenging problem,
as randomness takes part in the evolution of the data. The problem becomes
exceedingly complex when trajectories of the stochastic data are absent, as in
many scenarios. Here we present an approach to effectively model the dynamics
of stochastic data without trajectories, based on the weak form of the
Fokker-Planck (FP) equation, which governs the evolution of the density
function of a Brownian process. Taking collocations of Gaussian functions as
the test functions in the weak form of the FP equation, we transfer the
derivatives onto the Gaussian functions and thus approximate the weak form by
expectations estimated as sample averages over the data. With a dictionary
representation of the unknown terms, a linear system is built and then solved
by regression, revealing the unknown dynamics of the data. Hence, we name the
method the Weak Collocation Regression (WCR) method for its three key
components: weak form, collocation of Gaussian kernels, and regression.
Numerical experiments show that our method is flexible and fast, revealing the
dynamics within seconds in multi-dimensional problems, and that it extends
easily to high-dimensional data, such as 20 dimensions. WCR also correctly
identifies hidden dynamics in complex tasks with variable-dependent diffusion
and coupled drift, and its performance is robust, achieving high accuracy even
when noise is added.
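The pipeline described in the abstract (weak form of the FP equation, Gaussian collocation test functions, dictionary representation, regression) can be sketched in a few dozen lines. The 1-D drift, diffusion value, snapshot times, kernel centers, and bandwidth below are all illustrative assumptions, not the authors' setup or code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D example (not from the paper): dX = (X - X^3) dt + sigma dW.
# The solver only sees per-time samples ("aggregate data"), never trajectories.
sigma_true = 0.5
dt, n_paths = 1e-3, 20000
snap_times = np.linspace(0.1, 0.6, 6)

x = rng.normal(0.0, 1.0, n_paths)
snapshots, t = [], 0.0
for T in snap_times:
    while t < T - 1e-12:                      # Euler-Maruyama up to each snapshot time
        x = x + (x - x**3) * dt + sigma_true * np.sqrt(dt) * rng.normal(size=n_paths)
        t += dt
    snapshots.append(x.copy())

# Collocation of Gaussian kernels: test functions psi_c and their derivatives
centers, h = np.linspace(-2.0, 2.0, 9), 0.8
def psi(x, c):   return np.exp(-(x - c) ** 2 / (2 * h**2))
def dpsi(x, c):  return -(x - c) / h**2 * psi(x, c)
def d2psi(x, c): return ((x - c) ** 2 / h**4 - 1.0 / h**2) * psi(x, c)

# Weak form of the 1-D FP equation with drift dictionary {1, x, x^2, x^3}
# and constant D = sigma^2/2 (derivatives transferred onto psi):
#   d/dt E[psi] = sum_k lam_k E[x^k psi'(x)] + D E[psi''(x)]
degrees = [0, 1, 2, 3]
rows, rhs = [], []
for i in range(len(snap_times) - 1):
    X0, X1 = snapshots[i], snapshots[i + 1]
    dT = snap_times[i + 1] - snap_times[i]
    for c in centers:
        rhs.append((psi(X1, c).mean() - psi(X0, c).mean()) / dT)  # finite-difference d/dt E[psi]
        row = [0.5 * (np.mean(X0**k * dpsi(X0, c)) + np.mean(X1**k * dpsi(X1, c)))
               for k in degrees]                                  # trapezoid rule in time
        row.append(0.5 * (np.mean(d2psi(X0, c)) + np.mean(d2psi(X1, c))))
        rows.append(row)

coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
# coef[:4] estimates the drift coefficients (true values: 0, 1, 0, -1);
# coef[4] estimates D = sigma_true^2 / 2 (true value: 0.125)
```

Each (snapshot pair, kernel center) combination contributes one linear equation, so the overdetermined system can be solved in one least-squares call, which is consistent with the seconds-scale runtimes the abstract reports.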
Related papers
- An evolutionary approach for discovering non-Gaussian stochastic dynamical systems based on nonlocal Kramers-Moyal formulas [19.588387760309484]
This research endeavors to develop an evolutionary symbol sparse regression (ESSR) approach to extract non-Gaussian dynamical systems from sample path data.
The efficacy and capabilities of this approach are showcased through its application to several illustrative models.
arXiv Detail & Related papers (2024-09-29T03:35:01Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Weak Collocation Regression for Inferring Stochastic Dynamics with
Lévy Noise [8.15076267771005]
We propose a weak form of the Fokker-Planck (FP) equation for extracting dynamics with Lévy noise.
Our approach can simultaneously distinguish mixed noise types, even in multi-dimensional problems.
arXiv Detail & Related papers (2024-03-13T06:54:38Z) - Gradient-Based Feature Learning under Structured Data [57.76552698981579]
In the anisotropic setting, the commonly used spherical gradient dynamics may fail to recover the true direction.
We show that appropriate weight normalization that is reminiscent of batch normalization can alleviate this issue.
In particular, under the spiked model with a suitably large spike, the sample complexity of gradient-based training can be made independent of the information exponent.
arXiv Detail & Related papers (2023-09-07T16:55:50Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Extracting stochastic dynamical systems with $\alpha$-stable Lévy
noise from data [14.230182518492311]
We propose a data-driven method to extract systems with $\alpha$-stable Lévy noise from short burst data.
More specifically, we first estimate the Lévy jump measure and noise intensity.
Then we approximate the drift coefficient by combining nonlocal Kramers-Moyal formulas with normalizing flows.
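Kramers-Moyal formulas such as those mentioned above estimate drift and diffusion from conditional increments of short bursts; the nonlocal variants in the paper extend this idea to jump processes. A minimal sketch of the classical (local, Gaussian-noise) binned estimator, where the 1-D model, noise level, and bin layout are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D diffusion for illustration (not the paper's model):
#   dX = -X dt + 0.3 dW, observed as short two-point bursts (x0 -> x1).
dt, n = 1e-3, 200000
x0 = rng.uniform(-2.0, 2.0, n)                      # burst start points
x1 = x0 - x0 * dt + 0.3 * np.sqrt(dt) * rng.normal(size=n)

# Classical (local) Kramers-Moyal estimators on spatial bins:
#   drift(x)     ~ E[X(t+dt) - X(t) | X(t)=x] / dt
#   diffusion(x) ~ E[(X(t+dt) - X(t))^2 | X(t)=x] / dt   (~ sigma^2 for small dt)
edges = np.linspace(-2.0, 2.0, 21)
idx = np.digitize(x0, edges) - 1
inc = x1 - x0
drift = np.array([inc[idx == b].mean() / dt for b in range(20)])
diff2 = np.array([(inc[idx == b] ** 2).mean() / dt for b in range(20)])
mid = 0.5 * (edges[:-1] + edges[1:])
# Expect drift ~ -mid and diff2 ~ 0.09 across the bins.
```

For heavy-tailed Lévy noise the second conditional moment diverges, which is why the papers above replace these local formulas with nonlocal ones that separately estimate the jump measure.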
arXiv Detail & Related papers (2021-09-30T06:57:42Z) - Extracting Governing Laws from Sample Path Data of Non-Gaussian
Stochastic Dynamical Systems [4.527698247742305]
We infer equations with non-Gaussian Lévy noise from available data to reasonably predict dynamical behaviors.
We establish a theoretical framework and design a numerical algorithm to compute the asymmetric Lévy jump measure, drift and diffusion.
This method will become an effective tool in discovering the governing laws from available data sets and in understanding the mechanisms underlying complex random phenomena.
arXiv Detail & Related papers (2021-07-21T14:50:36Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role it plays in its success is still unclear.
We show that multiplicative noise commonly arises in the parameters due to variance.
A detailed analysis of key factors, including step size and data, is conducted, and state-of-the-art neural network models exhibit similar results.
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - A Data-Driven Approach for Discovering Stochastic Dynamical Systems with
Non-Gaussian Levy Noise [5.17900889163564]
We develop a new data-driven approach to extract governing laws from noisy data sets.
First, we establish a feasible theoretical framework by expressing the drift coefficient, diffusion coefficient, and jump measure.
We then design a numerical algorithm to compute the drift, diffusion coefficient and jump measure, and thus extract a governing equation with Gaussian and non-Gaussian noise.
arXiv Detail & Related papers (2020-05-07T21:29:17Z) - Learning Stochastic Behaviour from Aggregate Data [52.012857267317784]
Learning nonlinear dynamics from aggregate data is a challenging problem because the full trajectory of each individual is not available.
We propose a novel method using the weak form of Fokker Planck Equation (FPE) to describe the density evolution of data in a sampled form.
In such a sample-based framework we are able to learn the nonlinear dynamics from aggregate data without explicitly solving the partial differential equation (PDE) FPE.
arXiv Detail & Related papers (2020-02-10T03:20:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.