Data-driven prediction of multistable systems from sparse measurements
- URL: http://arxiv.org/abs/2010.14706v2
- Date: Wed, 12 May 2021 01:15:33 GMT
- Title: Data-driven prediction of multistable systems from sparse measurements
- Authors: Bryan Chu and Mohammad Farazmand
- Abstract summary: We develop a data-driven method, based on semi-supervised classification, to predict the state of multistable systems.
We introduce a sparsity-promoting metric-learning (SPML) optimization, which learns a metric directly from the precomputed data.
We demonstrate the application of this method on two multistable systems.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We develop a data-driven method, based on semi-supervised classification, to
predict the asymptotic state of multistable systems when only sparse spatial
measurements of the system are feasible. Our method predicts the asymptotic
behavior of an observed state by quantifying its proximity to the states in a
precomputed library of data. To quantify this proximity, we introduce a
sparsity-promoting metric-learning (SPML) optimization, which learns a metric
directly from the precomputed data. The optimization problem is designed so
that the resulting optimal metric satisfies two important properties: (i) It is
compatible with the precomputed library, and (ii) It is computable from sparse
measurements. We prove that the proposed SPML optimization is convex, its
minimizer is non-degenerate, and it is equivariant with respect to scaling of
the constraints. We demonstrate the application of this method on two
multistable systems: a reaction-diffusion equation, arising in pattern
formation, which has four asymptotically stable steady states and a
FitzHugh-Nagumo model with two asymptotically stable steady states.
Classifications of the multistable reaction-diffusion equation based on SPML
predict the asymptotic behavior of initial conditions based on two-point
measurements with 95% accuracy when a moderate number of labeled data points is
used. For the FitzHugh-Nagumo model, SPML predicts the asymptotic behavior of
conditions from one-point measurements with 90% accuracy. The learned optimal
metric also determines where the measurements need to be made to ensure
accurate predictions.
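The library-based classification idea above can be illustrated with a toy sketch. This is not the authors' SPML optimization; it is a minimal stand-in that shares its two key ingredients: a diagonal (weighted) metric that is linear in the weights, so the learning problem is convex, and a simplex constraint that concentrates the weights on a few grid points, mimicking the sparsity that tells you where to measure. All data here (the two synthetic stable states, the noise level, the grid size) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "library": two stable states on a 50-point spatial grid, plus noisy
# perturbed snapshots labeled by the state they converge to (synthetic data).
n_pts = 50
state_a = np.sin(np.linspace(0, np.pi, n_pts))
state_b = -state_a
library = np.vstack(
    [state_a + 0.05 * rng.standard_normal(n_pts) for _ in range(20)]
    + [state_b + 0.05 * rng.standard_normal(n_pts) for _ in range(20)]
)
labels = np.array([0] * 20 + [1] * 20)

# For a diagonal metric d_w(x, y) = sum_k w_k (x_k - y_k)^2, the total
# within-class minus between-class pairwise distance is linear in w: c @ w.
# Minimizing a linear objective over the probability simplex is convex, and
# its minimizer sits at a vertex, i.e. all weight lands on the single most
# discriminative grid point -- a crude analogue of sparsity promotion.
diff2 = (library[:, None, :] - library[None, :, :]) ** 2
same = labels[:, None] == labels[None, :]
c = diff2[same].sum(axis=0) - diff2[~same].sum(axis=0)
w = np.zeros(n_pts)
w[np.argmin(c)] = 1.0  # simplex vertex minimizing the linear objective

def classify(x):
    """Predict the asymptotic state by 1-nearest-neighbor in the learned
    metric, which here depends on a single measurement location."""
    d = ((library - x) ** 2) @ w
    return labels[np.argmin(d)]

print(classify(state_a + 0.05 * rng.standard_normal(n_pts)))
print(classify(state_b + 0.05 * rng.standard_normal(n_pts)))
print(int(np.count_nonzero(w)), "measurement location(s) selected")
```

The support of `w` plays the role the paper ascribes to the learned metric: it singles out where measurements must be taken (here, near the grid midpoint, where the two states differ most).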
Related papers
- Bayesian Estimation and Tuning-Free Rank Detection for Probability Mass Function Tensors [17.640500920466984]
This paper presents a novel framework for estimating the joint PMF and automatically inferring its rank from observed data.
We derive a deterministic solution based on variational inference (VI) to approximate the posterior distributions of various model parameters. Additionally, we develop a scalable version of the VI-based approach by leveraging stochastic variational inference (SVI).
Experiments involving both synthetic data and real movie recommendation data illustrate the advantages of our VI and SVI-based methods in terms of estimation accuracy, automatic rank detection, and computational efficiency.
arXiv Detail & Related papers (2024-10-08T20:07:49Z) - Measuring Stochastic Data Complexity with Boltzmann Influence Functions [12.501336941823627]
Estimating uncertainty of a model's prediction on a test point is a crucial part of ensuring reliability and calibration under distribution shifts.
We propose IF-COMP, a scalable and efficient approximation of the pNML distribution that linearizes the model with a temperature-scaled Boltzmann influence function.
We experimentally validate IF-COMP on uncertainty calibration, mislabel detection, and OOD detection tasks, where it consistently matches or beats strong baseline methods.
arXiv Detail & Related papers (2024-06-04T20:01:39Z) - MESSY Estimation: Maximum-Entropy based Stochastic and Symbolic densitY Estimation [4.014524824655106]
MESSY estimation is a Maximum-Entropy based Gradient and Symbolic densitY estimation method.
We construct a gradient-based drift-diffusion process that connects samples of the unknown distribution function to a guess symbolic expression.
We find that the addition of a symbolic search for basis functions improves the accuracy of the estimation at a reasonable additional computational cost.
arXiv Detail & Related papers (2023-06-07T03:28:47Z) - Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees [57.67528738886731]
We study the numerical stability of scalable sparse approximations based on inducing points.
For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions.
arXiv Detail & Related papers (2022-10-14T15:20:17Z) - Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects such as identifiability and the properties of statistical estimation remain obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
arXiv Detail & Related papers (2022-10-12T06:46:38Z) - Statistical Efficiency of Score Matching: The View from Isoperimetry [96.65637602827942]
We show a tight connection between statistical efficiency of score matching and the isoperimetric properties of the distribution being estimated.
We formalize these results both in the large-sample regime and in the finite-sample regime.
arXiv Detail & Related papers (2022-10-03T06:09:01Z) - Off-policy estimation of linear functionals: Non-asymptotic theory for semi-parametric efficiency [59.48096489854697]
The problem of estimating a linear functional based on observational data is canonical in both the causal inference and bandit literatures.
We prove non-asymptotic upper bounds on the mean-squared error of such procedures.
We establish its instance-dependent optimality in finite samples via matching non-asymptotic local minimax lower bounds.
arXiv Detail & Related papers (2022-09-26T23:50:55Z) - Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo [60.785586069299356]
This work provides a general framework for the non-asymptotic analysis of sampling error in 2-Wasserstein distance.
Our theoretical analysis is further validated by numerical experiments.
arXiv Detail & Related papers (2021-09-08T18:00:05Z) - Amortized Conditional Normalized Maximum Likelihood: Reliable Out-of-Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z) - Mean-squared-error-based adaptive estimation of pure quantum states and unitary transformations [0.0]
We propose a method to estimate with high accuracy pure quantum states of a single qudit.
Our method is based on the minimization of the squared error between the complex probability amplitudes of the unknown state and its estimate.
We show that our estimation procedure can be easily extended to estimate unknown unitary transformations acting on a single qudit.
arXiv Detail & Related papers (2020-08-23T00:32:10Z) - Manifold-adaptive dimension estimation revisited [0.0]
We revisit and improve the manifold-adaptive Farahmand-Szepesvári-Audibert dimension estimator.
We compute the probability density function of local FSA estimates.
We derive the maximum likelihood formula for global intrinsic dimensionality.
arXiv Detail & Related papers (2020-08-07T15:27:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.