Machine learning identifies nullclines in oscillatory dynamical systems
- URL: http://arxiv.org/abs/2503.16240v1
- Date: Thu, 20 Mar 2025 15:37:39 GMT
- Title: Machine learning identifies nullclines in oscillatory dynamical systems
- Authors: Bartosz Prokop, Jimmy Billen, Nikita Frolov, Lendert Gelens
- Abstract summary: We introduce CLINE, a neural network-based method that uncovers the hidden structure of nullclines from time series data. It overcomes challenges such as multiple time scales and strong nonlinearities while producing interpretable results convertible into symbolic differential equations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce CLINE (Computational Learning and Identification of Nullclines), a neural network-based method that uncovers the hidden structure of nullclines from oscillatory time series data. Unlike traditional approaches aiming at direct prediction of system dynamics, CLINE identifies static geometric features of the phase space that encode the (non)linear relationships between state variables. It overcomes challenges such as multiple time scales and strong nonlinearities while producing interpretable results convertible into symbolic differential equations. We validate CLINE on various oscillatory systems, showcasing its effectiveness.
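To make the idea of extracting nullclines from time series more concrete, here is a minimal, hypothetical sketch (not the authors' CLINE architecture, and with all model and parameter choices assumed for illustration): simulate a FitzHugh-Nagumo oscillator, regress a finite-difference estimate of dv/dt on the state (v, w) with a small neural network, and read the learned function's zero level set as an estimate of the v-nullcline.

```python
# Hypothetical sketch, not the CLINE method: learn where dv/dt vanishes from
# an oscillatory time series of the FitzHugh-Nagumo model.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPRegressor

def fhn(t, y, eps=0.08, a=0.7, b=0.8, I=0.5):
    v, w = y
    dv = v - v**3 / 3 - w + I      # fast (voltage-like) variable
    dw = eps * (v + a - b * w)     # slow (recovery-like) variable
    return [dv, dw]

# Simulate one oscillatory trajectory and estimate dv/dt by finite differences.
t = np.linspace(0.0, 200.0, 4000)
sol = solve_ivp(fhn, (t[0], t[-1]), [0.0, 0.0], t_eval=t, rtol=1e-8)
v, w = sol.y
dvdt = np.gradient(v, t)

# Regress dv/dt on the state; the zero level set of the learned function is an
# estimate of the v-nullcline (reliable only where the trajectory gives data).
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
model.fit(np.column_stack([v, w]), dvdt)

# Sanity check: on the true v-nullcline w = v - v**3/3 + I the prediction
# should be close to zero wherever the limit cycle covers the phase space.
v_line = np.linspace(v.min(), v.max(), 100)
w_line = v_line - v_line**3 / 3 + 0.5
resid = model.predict(np.column_stack([v_line, w_line]))
print("mean |predicted dv/dt| on the true nullcline:", np.abs(resid).mean())
```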
Related papers
- Neural Contraction Metrics with Formal Guarantees for Discrete-Time Nonlinear Dynamical Systems [17.905596843865705]
Contraction metrics provide a powerful framework for analyzing stability, robustness, and convergence of various dynamical systems.
However, identifying these metrics for complex nonlinear systems remains an open challenge due to the lack of effective tools.
This paper develops verifiable contraction metrics for scalable discrete-time nonlinear systems.
arXiv Detail & Related papers (2025-04-23T21:27:32Z)
- Generative System Dynamics in Recurrent Neural Networks [56.958984970518564]
We investigate the continuous-time dynamics of Recurrent Neural Networks (RNNs).
We show that skew-symmetric weight matrices are fundamental to enable stable limit cycles in both linear and nonlinear configurations.
Numerical simulations showcase how nonlinear activation functions not only maintain limit cycles, but also enhance the numerical stability of the system integration process.
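The skew-symmetry claim is easy to check numerically in the linear case. The sketch below (a generic toy, not the paper's RNN models) integrates dx/dt = S x with S = -Sᵀ; since d/dt ||x||² = 2 xᵀ S x = 0, the state norm is conserved and the trajectory stays on a closed orbit.

```python
# Toy check: a skew-symmetric weight matrix yields purely rotational linear
# dynamics, so the norm of the state is conserved (up to integrator error).
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
S = A - A.T                                # skew-symmetric: S.T == -S

x0 = rng.standard_normal(4)
sol = solve_ivp(lambda t, x: S @ x, (0.0, 100.0), x0,
                t_eval=np.linspace(0.0, 100.0, 2000), rtol=1e-9, atol=1e-12)

norms = np.linalg.norm(sol.y, axis=0)
print(f"norm drift over the run: {norms.max() - norms.min():.2e}")  # ~0
```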
arXiv Detail & Related papers (2025-04-16T10:39:43Z)
- Identification For Control Based on Neural Networks: Approximately Linearizable Models [42.15267357325546]
This work presents a control-oriented identification scheme for efficient control design and stability analysis of nonlinear systems.
Neural networks are used to identify a discrete-time nonlinear state-space model to approximate time-domain input-output behavior.
The network is constructed such that the identified model is approximately linearizable by feedback, ensuring that the control law trivially follows from the learning stage.
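To see why linearizability makes the control law immediate, consider a toy scalar model of the control-affine form x_{k+1} = f(x_k) + g(x_k) u_k (a hypothetical stand-in, not the paper's network): with g nonzero, the feedback u_k = (v_k - f(x_k)) / g(x_k) reduces the closed loop to the linear update x_{k+1} = v_k.

```python
# Hypothetical scalar example of exact feedback linearization for a
# control-affine discrete-time model x_{k+1} = f(x) + g(x) * u.
import numpy as np

f = lambda x: np.sin(x) + 0.5 * x      # stand-in for a learned drift term
g = lambda x: 2.0 + np.cos(x)          # stand-in for a learned, nonzero input gain

x = 1.0
for k in range(5):
    v = 0.8 * x                        # desired linear closed-loop behavior
    u = (v - f(x)) / g(x)              # linearizing feedback law
    x = f(x) + g(x) * u                # plant step: equals v exactly
    print(f"k={k}: x={x:.4f}")
```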
arXiv Detail & Related papers (2024-09-24T08:31:22Z)
- Machine learning approach to detect dynamical states from recurrence measures [0.0]
We implement three machine learning algorithms for this study: Logistic Regression, Random Forest, and Support Vector Machine.
For training and testing we generate synthetic data from standard nonlinear dynamical systems.
We illustrate how the trained algorithms can successfully predict the dynamical states of two variable stars, SX Her and AC Her.
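A minimal sketch of that three-classifier comparison with scikit-learn is shown below; the synthetic features generated here are only placeholders for the recurrence measures computed from time series in the actual study.

```python
# Placeholder features stand in for recurrence measures; labels stand in for
# dynamical states (e.g. periodic vs chaotic). Not the paper's data pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=6, n_informative=4,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for clf in (LogisticRegression(max_iter=1000),
            RandomForestClassifier(random_state=0),
            SVC()):
    acc = clf.fit(X_train, y_train).score(X_test, y_test)
    print(f"{clf.__class__.__name__}: test accuracy {acc:.2f}")
```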
arXiv Detail & Related papers (2024-01-18T05:02:36Z)
- Kalman Filter for Online Classification of Non-Stationary Data [101.26838049872651]
In Online Continual Learning (OCL) a learning system receives a stream of data and sequentially performs prediction and training steps.
We introduce a probabilistic Bayesian online learning model by using a neural representation and a state space model over the linear predictor weights.
In experiments in multi-class classification we demonstrate the predictive ability of the model and its flexibility to capture non-stationarity.
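As a simplified analogue of that setup (a scalar-output regression rather than multi-class classification, and not the paper's model), a Kalman filter with a random-walk state-space prior over the linear predictor weights can track a drifting target online:

```python
# Online Kalman filtering of linear predictor weights under a random-walk
# prior; the target weights drift to mimic non-stationary data.
import numpy as np

rng = np.random.default_rng(0)
d = 3
w_true = rng.standard_normal(d)            # true, slowly drifting weights
mu, P = np.zeros(d), np.eye(d)             # posterior mean / covariance
Q, R = 1e-3 * np.eye(d), 0.1               # process (drift) and observation noise

for step in range(500):
    w_true += np.sqrt(1e-3) * rng.standard_normal(d)     # non-stationarity
    x = rng.standard_normal(d)                            # input features
    y = w_true @ x + np.sqrt(R) * rng.standard_normal()   # noisy response

    P = P + Q                              # predict: weights follow a random walk
    S = x @ P @ x + R                      # innovation variance for y = w @ x + noise
    K = P @ x / S                          # Kalman gain
    mu = mu + K * (y - mu @ x)             # update posterior mean
    P = P - np.outer(K, x @ P)             # update posterior covariance

print("final weight error:", np.linalg.norm(mu - w_true))
```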
arXiv Detail & Related papers (2023-06-14T11:41:42Z)
- Learning Control-Oriented Dynamical Structure from Data [25.316358215670274]
We discuss a state-dependent nonlinear tracking controller formulation for general nonlinear control-affine systems.
We empirically demonstrate the efficacy of learned versions of this controller in stable trajectory tracking.
arXiv Detail & Related papers (2023-02-06T02:01:38Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
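To make the local-versus-global distinction concrete, the sketch below denoises a synthetic signal with a Savitzky-Golay filter (local: each estimate uses a sliding window of nearby samples) and a smoothing spline (global: fitted to the entire record). These particular filters are illustrative assumptions, not necessarily the ones benchmarked in the paper.

```python
# Local vs global a priori denoising of noisy state measurements (illustrative).
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
clean = np.sin(t) * np.exp(-0.1 * t)
noisy = clean + 0.1 * rng.standard_normal(t.size)

local = savgol_filter(noisy, window_length=31, polyorder=3)      # windowed fit
spline = UnivariateSpline(t, noisy, s=noisy.size * 0.1**2)       # whole-record fit
global_ = spline(t)

for name, est in [("local (Savitzky-Golay)", local), ("global (spline)", global_)]:
    rmse = np.sqrt(np.mean((est - clean) ** 2))
    print(f"{name}: RMSE {rmse:.4f}")
```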
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
- Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework that models these systems in a compact and comprehensive representation suitable for prediction and control.
Our approach learns the basis functions for such a representation via supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z)
- Active Learning for Nonlinear System Identification with Guarantees [102.43355665393067]
We study a class of nonlinear dynamical systems whose state transitions depend linearly on a known feature embedding of state-action pairs.
We propose an active learning approach that achieves this by repeating three steps: trajectory planning, trajectory tracking, and re-estimation of the system from all available data.
We show that our method estimates nonlinear dynamical systems at a parametric rate, similar to the statistical rate of standard linear regression.
arXiv Detail & Related papers (2020-06-18T04:54:11Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
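A rough caricature of such a network of gated first-order units is sketched below; the specific form is an assumption for intuition only, not the paper's liquid time-constant formulation. Each unit relaxes linearly toward a bias while a sigmoidal gate, driven by the input and the recurrent state, modulates that relaxation, which keeps the state bounded.

```python
# Caricature of a network of linear first-order units with nonlinear gates
# (assumed toy form, not the published liquid time-constant model).
import numpy as np

def step(x, u, W_in, W_rec, tau=1.0, A=1.0, dt=0.01):
    g = 1.0 / (1.0 + np.exp(-(W_in @ u + W_rec @ x)))   # sigmoidal gate in (0, 1)
    dx = -x / tau + g * (A - x)                          # gated linear relaxation
    return x + dt * dx                                   # explicit Euler step

rng = np.random.default_rng(0)
n_units, n_inputs = 8, 2
W_in = 0.5 * rng.standard_normal((n_units, n_inputs))
W_rec = 0.1 * rng.standard_normal((n_units, n_units))

x = np.zeros(n_units)
for k in range(1000):
    u = np.array([np.sin(0.02 * k), np.cos(0.03 * k)])   # toy input signal
    x = step(x, u, W_in, W_rec)
print("max |state| after 1000 steps:", float(np.max(np.abs(x))))
```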