Sparse identification of quasipotentials via a combined data-driven method
- URL: http://arxiv.org/abs/2407.05050v2
- Date: Mon, 10 Nov 2025 13:05:50 GMT
- Title: Sparse identification of quasipotentials via a combined data-driven method
- Authors: Bo Lin, Pierpaolo Belardinelli
- Abstract summary: We show how to discover parsimonious equations for the quasipotential directly from data. We use a neural network and a sparse regression algorithm, specifically designed to symbolically describe multistable energy landscapes. Our model-unbiased analytical forms of the quasipotential are of interest to a wide range of applications aimed at assessing metastability and energy landscapes.
- Score: 8.508437491732954
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The quasipotential function allows for comprehension and prediction of the escape mechanisms from metastable states in nonlinear dynamical systems. This function acts as a natural extension of the potential function for non-gradient systems, and it unveils important properties such as the maximum likelihood transition paths, transition rates and expected exit times of the system. Here, we demonstrate how to discover parsimonious equations for the quasipotential directly from data. Leveraging machine learning, we combine two existing data-driven techniques, namely a neural network and a sparse regression algorithm, specifically designed to symbolically describe multistable energy landscapes. First, we employ a vanilla neural network enhanced with a renormalization and rescaling procedure to achieve an orthogonal decomposition of the vector field. Next, we apply symbolic regression to extract the downhill and circulatory components of the decomposition, ensuring consistency with the underlying dynamics. This symbolic reconstruction involves a simultaneous regression that imposes constraints on both the orthogonality condition and the vector field. We implement and benchmark our approach using an archetypal model with a known exact quasipotential, as well as a nanomechanical resonator system. We further demonstrate its applicability to noisy data and to a four-dimensional system. Our model-unbiased analytical forms of the quasipotential are of interest to a wide range of applications aimed at assessing metastability and energy landscapes, serving to parametrically capture the distinctive fingerprint of the fluctuating dynamics.
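The second step of the abstract's pipeline, sparse symbolic regression over a candidate function library, can be illustrated with a minimal SINDy-style sketch. This is not the authors' code: the double-well potential, the monomial library, and the thresholding parameters below are illustrative assumptions, and the sketch shows only how sequentially thresholded least squares can recover a sparse symbolic form of a gradient ("downhill") component from samples of a vector field.

```python
import numpy as np

# Illustrative assumption: downhill drift -grad U of the double-well
# quasipotential U(x, y) = x^4/4 - x^2/2 + y^2/2, sampled on random points.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(500, 2))
x, y = X[:, 0], X[:, 1]
F = np.column_stack([-(x**3 - x), -y])  # f_x = x - x^3, f_y = -y

# Candidate library Theta: monomials up to degree 3 in (x, y).
names = ["1", "x", "y", "x^2", "x*y", "y^2", "x^3", "x^2*y", "x*y^2", "y^3"]
Theta = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2,
                         x**3, x**2 * y, x * y**2, y**3])

def stlsq(Theta, f, threshold=0.05, n_iter=10):
    """Sequentially thresholded least squares: fit, prune small
    coefficients, refit on the surviving library columns."""
    xi = np.linalg.lstsq(Theta, f, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big] = np.linalg.lstsq(Theta[:, big], f, rcond=None)[0]
    return xi

xi_x = stlsq(Theta, F[:, 0])
xi_y = stlsq(Theta, F[:, 1])
for comp, xi in (("f_x", xi_x), ("f_y", xi_y)):
    terms = [f"{c:+.2f}*{n}" for c, n in zip(xi, names) if c != 0.0]
    print(comp, "=", " ".join(terms))
```

In the paper's full method this regression is coupled with the neural-network decomposition and a simultaneous constraint on the orthogonality between the downhill and circulatory components; the sketch above covers only the sparse-identification step on a known drift.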
Related papers
- Disordered Dynamics in High Dimensions: Connections to Random Matrices and Machine Learning [52.26396748560348]
We provide an overview of high-dimensional dynamical systems driven by random matrices. We focus on applications to simple models of learning and generalization in machine learning theory.
arXiv Detail & Related papers (2026-01-03T00:12:32Z) - From Propagator to Oscillator: The Dual Role of Symmetric Differential Equations in Neural Systems [2.684545081600664]
We study the dynamics and functional diversity of a novel neuron model based on symmetric differential equations. The model exhibits two distinct trajectory behaviors: one is intrinsically stable, corresponding to a reliable signal propagator; the other is Lyapunov stable, characterized by sustained self-excited oscillations. These findings draw a compelling parallel to the dual roles of biological neurons in both information transmission and rhythm generation.
arXiv Detail & Related papers (2025-07-20T15:34:47Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Learning mechanical systems from real-world data using discrete forced Lagrangian dynamics [0.0]
We introduce a data-driven method for learning the equations of motion of mechanical systems directly from position measurements. This is particularly relevant in system identification tasks where only positional information is available, such as motion capture, pixel data or low-resolution tracking.
arXiv Detail & Related papers (2025-05-26T12:13:00Z) - The Spectral Bias of Shallow Neural Network Learning is Shaped by the Choice of Non-linearity [0.7499722271664144]
We study how non-linear activation functions contribute to shaping neural networks' implicit bias.
We show that local dynamical attractors facilitate the formation of clusters of hyperplanes where the input to a neuron's activation function is zero.
arXiv Detail & Related papers (2025-03-13T17:36:46Z) - Identifiable Representation and Model Learning for Latent Dynamic Systems [0.0]
We study the problem of identifiable representation and model learning for latent dynamic systems.
We prove that, for linear and affine nonlinear latent dynamic systems with sparse input matrices, it is possible to identify the latent variables up to scaling.
arXiv Detail & Related papers (2024-10-23T13:55:42Z) - Quasi-potential and drift decomposition in stochastic systems by sparse identification [0.0]
The quasi-potential is a key concept in stochastic systems, as it accounts for the long-term behavior of the dynamics of such systems.
This paper combines a sparse learning technique with action minimization methods in order to determine the quasi-potential.
We implement the proposed approach in 2- and 3-D systems, covering various types of potential landscapes and attractors.
arXiv Detail & Related papers (2024-09-10T22:02:15Z) - Learning invariant representations of time-homogeneous stochastic dynamical systems [27.127773672738535]
We study the problem of learning a representation of the state that faithfully captures its dynamics.
This is instrumental to learning the transfer operator or the generator of the system.
We show that the search for a good representation can be cast as an optimization problem over neural networks.
arXiv Detail & Related papers (2023-07-19T11:32:24Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Stretched and measured neural predictions of complex network dynamics [2.1024950052120417]
Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering a model of dynamical systems.
A recently employed machine learning tool for studying dynamics is neural networks, which can be used for data-driven solution finding or discovery of differential equations.
We show that extending the model's generalizability beyond traditional statistical learning theory limits is feasible.
arXiv Detail & Related papers (2023-01-12T09:44:59Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Decimation technique for open quantum systems: a case study with
driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z) - Structure-Preserving Learning Using Gaussian Processes and Variational
Integrators [62.31425348954686]
We propose the combination of a variational integrator for the nominal dynamics of a mechanical system and learning residual dynamics with Gaussian process regression.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
arXiv Detail & Related papers (2021-12-10T11:09:29Z) - Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns these basis functions using a supervised learning approach.
arXiv Detail & Related papers (2021-09-06T04:39:06Z) - A Data Driven Method for Computing Quasipotentials [8.055813148141246]
The quasipotential plays a central role in characterizing statistics of transition events and likely transition paths.
Traditional methods based on the dynamic programming principle or path space tend to suffer from the curse of dimensionality.
We show that our method can effectively compute quasipotential landscapes without requiring spatial discretization or solving path-space optimization problems.
arXiv Detail & Related papers (2020-12-13T02:32:49Z) - Gradient Starvation: A Learning Proclivity in Neural Networks [97.02382916372594]
Gradient Starvation arises when cross-entropy loss is minimized by capturing only a subset of features relevant for the task.
This work provides a theoretical explanation for the emergence of such feature imbalance in neural networks.
arXiv Detail & Related papers (2020-11-18T18:52:08Z) - Stochastic embeddings of dynamical phenomena through variational
autoencoders [1.7205106391379026]
We use a recognition network to increase the observed space dimensionality during the reconstruction of the phase space.
Our validation shows that this approach not only recovers a state space that resembles the original one, but it is also able to synthetize new time series.
arXiv Detail & Related papers (2020-10-13T10:10:24Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.