Inference of hidden common driver dynamics by anisotropic self-organizing neural networks
- URL: http://arxiv.org/abs/2504.01811v1
- Date: Wed, 02 Apr 2025 15:17:23 GMT
- Title: Inference of hidden common driver dynamics by anisotropic self-organizing neural networks
- Authors: Zsigmond Benkő, Marcell Stippinger, Zoltán Somogyvári
- Abstract summary: We introduce a novel approach to infer the underlying dynamics of hidden common drivers. The inference relies on time-delay embedding and estimation of the intrinsic dimension of the observed systems.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We introduce a novel approach to infer the underlying dynamics of hidden common drivers, based on analyzing time series data from two driven dynamical systems. The inference relies on time-delay embedding, estimation of the intrinsic dimension of the observed systems, and their mutual dimension. A key component of our approach is a new anisotropic training technique applied to Kohonen's self-organizing map, which effectively learns the attractor of the driven system and separates it into submanifolds corresponding to the self-dynamics and shared dynamics. To demonstrate the effectiveness of our method, we conducted simulated experiments using different chaotic maps in a setup where two chaotic maps were driven by a third map with nonlinear coupling. The inferred time series exhibited high correlation with the time series of the actual hidden common driver, in contrast to the observed systems. The quality of our reconstruction was compared and shown to be superior to several other methods intended to find the common features behind the observed time series, including linear methods such as PCA and ICA, as well as nonlinear methods such as dynamical component analysis, canonical correlation analysis, and even deep canonical correlation analysis.
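Two of the abstract's ingredients — time-delay embedding and Kohonen's self-organizing map — can be sketched in a few lines. The anisotropic neighborhood below (a different Gaussian width per map axis, so the map stretches preferentially along one direction) is a hypothetical simplification for illustration; the paper's actual training rule, grid geometry, and parameter choices are not specified here and may differ.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series x into R^dim with lag tau."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

def train_som(data, grid=(20, 20), epochs=3, lr=0.5, sigma=(3.0, 1.0), seed=0):
    """Minimal Kohonen SOM with an axis-dependent (anisotropic) neighborhood.

    sigma[0] != sigma[1] gives a wider neighborhood along one map axis,
    a simplified stand-in for the anisotropic training in the paper.
    """
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = data.mean(axis=0) + rng.standard_normal((h, w, data.shape[1])) * data.std(axis=0)
    ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    for ep in range(epochs):
        decay = np.exp(-ep / epochs)  # shrink learning rate and neighborhood over epochs
        for x in data[rng.permutation(len(data))]:
            d = np.linalg.norm(weights - x, axis=2)
            bi, bj = np.unravel_index(np.argmin(d), d.shape)  # best-matching unit
            # anisotropic Gaussian neighborhood around the best-matching unit
            nb = np.exp(-((ii - bi) ** 2 / (2 * (sigma[0] * decay) ** 2)
                          + (jj - bj) ** 2 / (2 * (sigma[1] * decay) ** 2)))
            weights += lr * decay * nb[..., None] * (x - weights)
    return weights

# logistic-map series as a stand-in driven system, embedded in 3 dimensions
x = np.empty(2000)
x[0] = 0.4
for t in range(1999):
    x[t + 1] = 3.99 * x[t] * (1.0 - x[t])

emb = delay_embed(x, dim=3, tau=1)   # shape (1998, 3)
som = train_som(emb)                 # shape (20, 20, 3)
print(emb.shape, som.shape)
```

With a 20x20 grid the trained weight array can be read as a 2-D parametrization of the embedded attractor; in the paper's setup, one map axis would ideally track the self-dynamics and the other the shared driver dynamics.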
Related papers
- Communities in the Kuramoto Model: Dynamics and Detection via Path Signatures [1.024113475677323]
We propose a mathematical framework that encodes geometric and temporal properties of continuous paths to address this problem. Path signatures provide a reparametrization-invariant characterization of dynamical data. We propose a novel signature-based community detection algorithm, achieving exact recovery of structural communities from observed time series.
arXiv Detail & Related papers (2025-03-21T21:41:48Z) - Machine learning approach to detect dynamical states from recurrence measures [0.0]
We implement three machine learning algorithms Logistic Regression, Random Forest, and Support Vector Machine for this study.
For training and testing we generate synthetic data from standard nonlinear dynamical systems.
We illustrate how the trained algorithms can successfully predict the dynamical states of two variable stars, SX Her and AC Her.
arXiv Detail & Related papers (2024-01-18T05:02:36Z) - Deep Learning-based Analysis of Basins of Attraction [49.812879456944984]
This research addresses the challenge of characterizing the complexity and unpredictability of basins within various dynamical systems.
The main focus is on demonstrating the efficiency of convolutional neural networks (CNNs) in this field.
arXiv Detail & Related papers (2023-09-27T15:41:12Z) - Beyond Geometry: Comparing the Temporal Structure of Computation in Neural Circuits with Dynamical Similarity Analysis [7.660368798066376]
We introduce a novel similarity metric that compares two systems at the level of their dynamics.
Our method opens the door to comparative analyses of the essential temporal structure of computation in neural circuits.
arXiv Detail & Related papers (2023-06-16T20:11:38Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Initial Correlations in Open Quantum Systems: Constructing Linear Dynamical Maps and Master Equations [62.997667081978825]
We show that, for any predetermined initial correlations, one can introduce a linear dynamical map on the space of operators of the open system.
We demonstrate that this construction leads to a linear, time-local quantum master equation with generalized Lindblad structure.
arXiv Detail & Related papers (2022-10-24T13:43:04Z) - Decomposed Linear Dynamical Systems (dLDS) for learning the latent components of neural dynamics [6.829711787905569]
We propose a new decomposed dynamical system model that represents complex non-stationary and nonlinear dynamics of time series data.
Our model is trained through a dictionary learning procedure, where we leverage recent results in tracking sparse vectors over time.
In both continuous-time and discrete-time instructional examples, we demonstrate that our model approximates the original system well.
arXiv Detail & Related papers (2022-06-07T02:25:38Z) - Linearization and Identification of Multiple-Attractors Dynamical System through Laplacian Eigenmaps [8.161497377142584]
We propose a Graph-based spectral clustering method that takes advantage of a velocity-augmented kernel to connect data-points belonging to the same dynamics.
We prove that there always exists a set of 2-dimensional embedding spaces in which the sub-dynamics are linear, and an n-dimensional embedding in which they are quasi-linear.
We learn a diffeomorphism from the Laplacian embedding space to the original space and show that the Laplacian embedding leads to good reconstruction accuracy and a faster training time.
arXiv Detail & Related papers (2022-02-18T12:43:25Z) - Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns these basis functions using a supervised learning approach.
arXiv Detail & Related papers (2021-09-06T04:39:06Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.