Disordered Dynamics in High Dimensions: Connections to Random Matrices and Machine Learning
- URL: http://arxiv.org/abs/2601.01010v1
- Date: Sat, 03 Jan 2026 00:12:32 GMT
- Title: Disordered Dynamics in High Dimensions: Connections to Random Matrices and Machine Learning
- Authors: Blake Bordelon, Cengiz Pehlevan
- Abstract summary: We provide an overview of high dimensional dynamical systems driven by random matrices. We focus on applications to simple models of learning and generalization in machine learning theory.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We provide an overview of high dimensional dynamical systems driven by random matrices, focusing on applications to simple models of learning and generalization in machine learning theory. Using both cavity method arguments and path integrals, we review how the behavior of a coupled infinite dimensional system can be characterized as a stochastic process for each single site of the system. We provide a pedagogical treatment of dynamical mean field theory (DMFT), a framework that can be flexibly applied to these settings. The DMFT single site stochastic process is fully characterized by a set of (two-time) correlation and response functions. For linear time-invariant systems, we illustrate connections between random matrix resolvents and the DMFT response. We demonstrate applications of these ideas to machine learning models such as gradient flow, stochastic gradient descent on random feature models and deep linear networks in the feature learning regime trained on random data. We demonstrate how bias and variance decompositions (analysis of ensembling, bagging, etc.) can be computed by averaging over subsets of the DMFT noise variables. From our formalism we also investigate how linear systems driven with random non-Hermitian matrices (such as random feature models) can exhibit non-monotonic loss curves with training time, while Hermitian matrices with matching spectra do not, highlighting a different mechanism for non-monotonicity than small eigenvalues causing instability to label noise. Lastly, we provide asymptotic descriptions of the training and test loss dynamics for randomly initialized deep linear neural networks trained in the feature learning regime with high-dimensional random data. In this case, the time translation invariance structure is lost and the hidden layer weights are characterized as spiked random matrices.
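The non-Hermitian mechanism mentioned in the abstract can be illustrated with a two-dimensional toy model (a hand-picked non-normal matrix, not the paper's random-matrix setup): under gradient-flow-like dynamics dx/dt = -A x, a non-normal A produces a transient rise in the loss ||x||^2, while a Hermitian matrix with the same eigenvalues decays monotonically.

```python
import numpy as np

# Toy sketch (not the paper's DMFT computation): dynamics dx/dt = -A x with
# loss L(t) = ||x(t)||^2.  A non-normal A can show transient growth, i.e. a
# non-monotonic loss, even though a Hermitian matrix with the identical
# spectrum decays monotonically.
def loss_curve(A, x0, dt=0.01, steps=500):
    x = x0.astype(float).copy()
    losses = [float(x @ x)]
    for _ in range(steps):
        x = x - dt * (A @ x)          # forward-Euler step of dx/dt = -A x
        losses.append(float(x @ x))
    return np.array(losses)

A_nonnormal = np.array([[1.0, 10.0], [0.0, 2.0]])  # eigenvalues 1 and 2
A_hermitian = np.diag([1.0, 2.0])                  # same spectrum, Hermitian
x0 = np.array([0.0, 1.0])

L_nn = loss_curve(A_nonnormal, x0)
L_h = loss_curve(A_hermitian, x0)

print("non-normal loss rises transiently:", L_nn.max() > L_nn[0])
print("Hermitian loss is monotone:", bool(np.all(np.diff(L_h) <= 0)))
```

The key point is that non-monotonicity here comes from non-normality (the matrix and its transpose do not commute), not from small or unstable eigenvalues: both matrices above have the same strictly positive spectrum.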
Related papers
- Learning Nonlinear Dynamics in Physical Modelling Synthesis using Neural Ordinary Differential Equations [13.755383470312001]
A modal decomposition leads to a coupled nonlinear system of ordinary differential equations. Recent work has applied machine learning approaches to model lumped dynamic systems automatically from data. We show that the model can be trained to reproduce the nonlinear dynamics of the system.
arXiv Detail & Related papers (2025-05-15T17:17:21Z) - Learning Stochastic Dynamical Systems with Structured Noise [12.056775765064266]
Due to the availability of large-scale data sets, there is growing interest in learning models from observations with noise. We present a nonparametric framework to learn both the drift and diffusion terms in systems of SDEs where the noise is singular. We provide an algorithm for constructing estimators given trajectory data and demonstrate the effectiveness of our methods.
arXiv Detail & Related papers (2025-03-03T00:40:53Z) - Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for L^2, L^∞, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
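As a rough illustration of drift/diffusion estimation from trajectory data (a generic moment-based sketch, not this paper's estimator), a linear drift and constant diffusion can be recovered from Euler-Maruyama increments:

```python
import numpy as np

# Minimal sketch: for dX = b(X) dt + s dW sampled at step dt, the
# Euler-Maruyama moment relations E[dX | X] ~ b(X) dt and E[dX^2] ~ s^2 dt
# let us recover a linear drift slope and the diffusion constant from one
# simulated trajectory of an Ornstein-Uhlenbeck process.
rng = np.random.default_rng(0)
dt, n = 1e-3, 200_000
theta, s_true = 2.0, 0.5                     # ground truth: dX = -2 X dt + 0.5 dW

x = np.empty(n)
x[0] = 0.0
dW = rng.normal(0.0, np.sqrt(dt), n - 1)
for i in range(n - 1):                       # Euler-Maruyama simulation
    x[i + 1] = x[i] - theta * x[i] * dt + s_true * dW[i]

dx = np.diff(x)
slope_hat = (x[:-1] @ dx) / (x[:-1] @ x[:-1]) / dt   # least-squares drift slope
s2_hat = np.mean(dx**2) / dt                          # diffusion estimate

print(slope_hat)   # close to -2
print(s2_hat)      # close to 0.25
```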
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - HyperSINDy: Deep Generative Modeling of Nonlinear Stochastic Governing Equations [5.279268784803583]
We introduce HyperSINDy, a framework for modeling dynamics via a deep generative model of sparse governing equations from data.
Once trained, HyperSINDy generates dynamics via a differential equation whose coefficients are driven by white noise.
In experiments, HyperSINDy recovers ground truth governing equations, with learned stochasticity scaling to match that of the data.
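For context, the classic SINDy procedure that HyperSINDy builds on (sequential thresholded least squares over a library of candidate terms; the deep generative layer is not reproduced here) can be sketched in a few lines:

```python
import numpy as np

# Sketch of the classic SINDy idea: regress measured derivatives onto a
# library of candidate terms, then repeatedly zero out small coefficients and
# refit.  Here we recover dx/dt = -2x + 0.5x^3 from noiseless samples.
x = np.linspace(-2, 2, 200)
dxdt = -2.0 * x + 0.5 * x**3                                 # "measured" derivatives
library = np.column_stack([np.ones_like(x), x, x**2, x**3])  # candidate terms

coef = np.linalg.lstsq(library, dxdt, rcond=None)[0]
for _ in range(5):                           # sequential thresholding
    small = np.abs(coef) < 0.1
    coef[small] = 0.0
    big = ~small
    coef[big] = np.linalg.lstsq(library[:, big], dxdt, rcond=None)[0]

print(coef)   # recovers [0, -2, 0, 0.5]
```

HyperSINDy's contribution is to make the sparse coefficients themselves outputs of a generative model driven by noise, so the recovered equations are stochastic rather than deterministic as above.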
arXiv Detail & Related papers (2023-10-07T14:41:59Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
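The fit-a-forward-model idea can be sketched with a toy dispersion and an analytic gradient standing in for automatic differentiation (the model function and its parameter below are illustrative stand-ins, not the paper's Hamiltonian or network):

```python
import numpy as np

# Hedged sketch of gradient-based parameter recovery from a differentiable
# forward model.  The toy dispersion omega(q) = 2*J*(1 - cos q) plays the role
# of the simulated model, and gradient descent recovers the coupling J from
# "experimental" data.
q = np.linspace(0.0, np.pi, 50)
J_true = 1.5
data = 2.0 * J_true * (1.0 - np.cos(q))      # stands in for measured spectra

def model(J):
    return 2.0 * J * (1.0 - np.cos(q))

J, lr = 0.0, 1e-3
for _ in range(100):                         # gradient descent on squared error
    resid = model(J) - data
    grad = np.sum(2.0 * resid * 2.0 * (1.0 - np.cos(q)))   # analytic dL/dJ
    J -= lr * grad

print(J)   # converges to 1.5
```

In the paper's setting the forward model is a neural network trained once on simulations, and automatic differentiation supplies the gradient; the fitting loop itself is the same shape as above.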
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Formal Controller Synthesis for Markov Jump Linear Systems with Uncertain Dynamics [64.72260320446158]
We propose a method for synthesising controllers for Markov jump linear systems.
Our method is based on a finite-state abstraction that captures both the discrete (mode-jumping) and continuous (stochastic linear) behaviour of the MJLS.
We apply our method to multiple realistic benchmark problems, in particular, a temperature control and an aerial vehicle delivery problem.
arXiv Detail & Related papers (2022-12-01T17:36:30Z) - Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN [0.0]
We show that the use of non-linear techniques in machine learning and data-driven methods is highly relevant.
Generative Adversarial Networks (GANs) are suited for such applications, where the Wasserstein-GAN with gradient penalty variant offers improved results.
arXiv Detail & Related papers (2021-10-26T13:18:06Z) - Fluctuation-dissipation Type Theorem in Stochastic Linear Learning [2.8292841621378844]
The fluctuation-dissipation theorem (FDT) is a simple yet powerful consequence of the first-order differential equation governing the dynamics of systems subject simultaneously to dissipative and random forces.
The linear learning dynamics, in which the input vector maps to the output vector by a linear matrix whose elements are the subject of learning, has a stochastic version closely mimicking the Langevin dynamics when a full-batch gradient descent scheme is replaced by that of stochastic gradient descent.
We derive a generalized FDT for the linear learning dynamics and verify its validity on well-known machine learning data sets such as MNIST and CIFAR-10.
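A minimal numerical check of a fluctuation-dissipation-type relation in this spirit (a scalar toy model, not the paper's generalized FDT) can be run directly:

```python
import numpy as np

# Illustrative check: for the SGD-like noisy update
#   w <- w - eta * (a*w + xi),   xi ~ N(0, sigma^2),
# the stationary weight variance obeys
#   Var(w) = eta * sigma^2 / (a * (2 - eta * a)),
# tying the fluctuation (Var) to the dissipation (a) and noise scale (eta*sigma^2).
rng = np.random.default_rng(1)
eta, a, sigma = 0.1, 1.0, 1.0
steps, burn_in = 200_000, 2_000

xi = sigma * rng.normal(size=steps)
w, trace = 0.0, np.empty(steps)
for t in range(steps):
    w -= eta * (a * w + xi[t])       # one stochastic update
    trace[t] = w

var_emp = trace[burn_in:].var()
var_fdt = eta * sigma**2 / (a * (2 - eta * a))
print(var_emp, var_fdt)              # both near 0.0526
```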
arXiv Detail & Related papers (2021-06-04T02:54:26Z) - Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consists of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z) - Learning with Density Matrices and Random Features [44.98964870180375]
A density matrix describes the statistical state of a quantum system.
It is a powerful formalism to represent both the quantum and classical uncertainty of quantum systems.
This paper explores how density matrices can be used as a building block for machine learning models.
arXiv Detail & Related papers (2021-02-08T17:54:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.