A new local time-decoupled squared Wasserstein-2 method for training stochastic neural networks to reconstruct uncertain parameters in dynamical systems
- URL: http://arxiv.org/abs/2503.05068v1
- Date: Fri, 07 Mar 2025 01:20:43 GMT
- Title: A new local time-decoupled squared Wasserstein-2 method for training stochastic neural networks to reconstruct uncertain parameters in dynamical systems
- Authors: Mingtao Xia, Qijing Shen, Philip Maini, Eamonn Gaffney, Alex Mogilner
- Abstract summary: We show that a stochastic neural network model can be effectively trained by minimizing our proposed local time-decoupled squared Wasserstein-2 loss function. We showcase the effectiveness of our proposed method in reconstructing the distribution of parameters in different dynamical systems.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, we propose and analyze a new local time-decoupled squared Wasserstein-2 method for reconstructing the distribution of unknown parameters in dynamical systems. Specifically, we show that a stochastic neural network model, which can be effectively trained by minimizing our proposed local time-decoupled squared Wasserstein-2 loss function, is an effective model for approximating the distribution of uncertain model parameters in dynamical systems. Through several numerical examples, we showcase the effectiveness of our proposed method in reconstructing the distribution of parameters in different dynamical systems.
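For equal-size one-dimensional empirical samples, the squared Wasserstein-2 distance has a simple closed form: the optimal coupling pairs order statistics, so $W_2^2$ is the mean squared difference of sorted values. The sketch below illustrates that special case, together with a naive "time-decoupled" average over per-time-point marginals. Function names and the trajectory layout are assumptions for illustration only; the authors' actual loss handles more general settings than this 1D case.

```python
def w2_squared_1d(xs, ys):
    """Squared Wasserstein-2 distance between two equal-size 1D empirical
    distributions: the optimal coupling pairs order statistics, so W2^2 is
    the mean squared difference of the sorted samples."""
    assert len(xs) == len(ys), "empirical samples must have equal size"
    return sum((x - y) ** 2 for x, y in zip(sorted(xs), sorted(ys))) / len(xs)

def local_time_decoupled_loss(observed, simulated):
    """Hypothetical time-decoupled loss: compare the empirical marginal at
    each time point separately and average, instead of matching whole
    trajectories jointly. Each argument is a list of per-time sample lists."""
    return sum(w2_squared_1d(obs_t, sim_t)
               for obs_t, sim_t in zip(observed, simulated)) / len(observed)
```

Decoupling the marginals in time avoids solving a high-dimensional transport problem over whole trajectories, at the cost of ignoring cross-time correlations.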
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to the coefficients' regularity.
Our method is available as an open-source Python library.
arXiv Detail & Related papers (2024-11-04T11:09:58Z)
- A local squared Wasserstein-2 method for efficient reconstruction of models with uncertainty [0.0]
We propose a local squared Wasserstein-2 ($W_2$) method to solve the inverse problem of reconstructing models with uncertain latent variables or parameters.
A key advantage of our approach is that it does not require prior information on the distribution of the latent variables or parameters in the underlying models.
arXiv Detail & Related papers (2024-06-10T22:15:55Z)
- Learning minimal representations of stochastic processes with variational autoencoders [52.99137594502433]
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of unknown parameters describing stochastic processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z)
- Parameter Identification for Partial Differential Equations with Spatiotemporal Varying Coefficients [5.373009527854677]
We propose a framework that facilitates the investigation of parameter identification for multi-state systems governed by varying partial differential equations.
Our framework consists of two integral components: a constrained self-adaptive neural network, and a sub-network physics-informed neural network.
We have showcased the efficacy of our framework on two numerical cases: the 1D Burgers' equation with time-varying parameters and the 2D wave equation with a space-varying parameter.
arXiv Detail & Related papers (2023-06-30T07:17:19Z)
- Learning Nonautonomous Systems via Dynamic Mode Decomposition [0.0]
We present a data-driven learning approach for unknown nonautonomous dynamical systems with time-dependent inputs, based on dynamic mode decomposition (DMD).
To circumvent the difficulty of approximating the time-dependent Koopman operators for nonautonomous systems, a modified system is employed as an approximation to the original nonautonomous system.
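As a concrete illustration of the DMD idea, in the scalar case the least-squares fit of a one-step linear map x_{k+1} ≈ a·x_k over consecutive snapshot pairs has a closed form. This is a minimal sketch under that drastic simplification; the function name is hypothetical, and the modified-system construction for nonautonomous dynamics described above is considerably more involved.

```python
def fit_dmd_1d(snapshots):
    """Minimal scalar analogue of dynamic mode decomposition: fit the
    one-step linear map x_{k+1} ~= a * x_k by least squares over all
    consecutive snapshot pairs. Returns the fitted coefficient a."""
    pairs = list(zip(snapshots, snapshots[1:]))
    # Normal-equation solution of min_a sum (x1 - a * x0)^2
    numerator = sum(x1 * x0 for x0, x1 in pairs)
    denominator = sum(x0 * x0 for x0, _ in pairs)
    return numerator / denominator
```

In the full method the scalar `a` becomes a best-fit linear operator between snapshot matrices, whose eigendecomposition yields the dynamic modes.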
arXiv Detail & Related papers (2023-06-27T16:58:26Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Reduced order modeling of parametrized systems through autoencoders and SINDy approach: continuation of periodic solutions [0.0]
This work presents a data-driven, non-intrusive framework which combines ROM construction with reduced dynamics identification.
The proposed approach leverages autoencoder neural networks with parametric sparse identification of nonlinear dynamics (SINDy) to construct a low-dimensional dynamical model.
These models aim to track the evolution of periodic steady-state responses as functions of the system parameters, avoiding computation of the transient phase and enabling the detection of instabilities and bifurcations.
arXiv Detail & Related papers (2022-11-13T01:57:18Z)
- Time varying regression with hidden linear dynamics [74.9914602730208]
We revisit a model for time-varying linear regression that assumes the unknown parameters evolve according to a linear dynamical system.
Counterintuitively, we show that when the underlying dynamics are stable the parameters of this model can be estimated from data by combining just two ordinary least squares estimates.
arXiv Detail & Related papers (2021-12-29T23:37:06Z)
- MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
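The mutual-information view referenced above rests on the standard definition I(θ; x) = E[log(p(θ, x) / (p(θ) p(x)))], i.e. the expected log likelihood-to-evidence ratio. As a minimal sketch, the quantity can be computed exactly for a discrete joint distribution; the function name is hypothetical, and the paper instead estimates this from sampled trajectories with neural networks.

```python
from math import log

def mutual_information(joint):
    """Mutual information (in nats) of a discrete joint distribution,
    given as a dict mapping (theta, x) pairs to probabilities."""
    p_theta, p_x = {}, {}
    for (a, b), p in joint.items():
        p_theta[a] = p_theta.get(a, 0.0) + p  # marginal over theta
        p_x[b] = p_x.get(b, 0.0) + p          # marginal over x
    # I(theta; x) = sum_{a,b} p(a,b) * log( p(a,b) / (p(a) * p(b)) )
    return sum(p * log(p / (p_theta[a] * p_x[b]))
               for (a, b), p in joint.items() if p > 0)
```

An independent joint yields zero, while a perfectly correlated binary joint yields log 2 ≈ 0.693 nats.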
arXiv Detail & Related papers (2021-06-03T12:59:16Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Data-driven learning of non-autonomous systems [4.459185142332526]
We present a numerical framework for recovering unknown non-autonomous dynamical systems with time-dependent inputs.
To circumvent the difficulty presented by the non-autonomous nature of the system, our method transforms the learning problem into piecewise local approximations of the system over a discrete set of time instances.
arXiv Detail & Related papers (2020-06-02T15:33:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.