Geometric Learning Dynamics
- URL: http://arxiv.org/abs/2504.14728v1
- Date: Sun, 20 Apr 2025 19:56:41 GMT
- Title: Geometric Learning Dynamics
- Authors: Vitaly Vanchurin
- Abstract summary: We present a unified framework for modeling learning dynamics in physical, biological, and machine learning systems. The quantum regime corresponds to $a = 1$ and describes Schr\"odinger-like dynamics that emerges from a discrete shift symmetry. The efficient learning regime corresponds to $a = \tfrac{1}{2}$ and describes very fast machine learning algorithms.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a unified geometric framework for modeling learning dynamics in physical, biological, and machine learning systems. The theory reveals three fundamental regimes, each emerging from the power-law relationship $g \propto \kappa^a$ between the metric tensor $g$ in the space of trainable variables and the noise covariance matrix $\kappa$. The quantum regime corresponds to $a = 1$ and describes Schr\"odinger-like dynamics that emerges from a discrete shift symmetry. The efficient learning regime corresponds to $a = \tfrac{1}{2}$ and describes very fast machine learning algorithms. The equilibration regime corresponds to $a = 0$ and describes classical models of biological evolution. We argue that the emergence of the intermediate regime $a = \tfrac{1}{2}$ is a key mechanism underlying the emergence of biological complexity.
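As an illustration of how the power-law coupling $g \propto \kappa^a$ might enter a concrete update rule, here is a minimal one-dimensional Langevin-style sketch in Python. The loss $L(q) = q^2/2$, the noise profile $\kappa(q)$, and the Euler-Maruyama discretization are all illustrative assumptions; nothing below reproduces the paper's actual equations.

```python
import numpy as np

rng = np.random.default_rng(0)

def late_time_loss(a, steps=20000, dt=1e-3, q0=3.0):
    """Toy overdamped learner with the power-law coupling g ∝ κ^a.

    Everything here (loss, noise profile, discretization) is an
    illustrative assumption, not the paper's actual dynamics.
    """
    q, losses = q0, []
    for _ in range(steps):
        kappa = 1.0 / (1.0 + q**2)                 # assumed noise covariance κ(q)
        g = kappa**a                               # metric from the power law g ∝ κ^a
        q += -(q / g) * dt                         # metric-preconditioned gradient of L = q²/2
        q += np.sqrt(kappa * dt) * rng.normal()    # noise with covariance κ
        losses.append(0.5 * q**2)
    return np.mean(losses[steps // 2:])            # late-time average loss

for a in (0.0, 0.5, 1.0):   # equilibration / efficient learning / quantum regimes
    print(f"a = {a}: late-time <L> ≈ {late_time_loss(a):.4f}")
```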
Related papers
- Correspondence between open bosonic systems and stochastic differential equations [77.34726150561087]
We show that there can also be an exact correspondence at finite $n$ when the bosonic system is generalized to include interactions with the environment.
A particular system with the form of a discrete nonlinear Schr\"odinger equation is analyzed in more detail.
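For concreteness, here is a minimal integration sketch of a textbook discrete nonlinear Schr\"odinger equation; the lattice size, coupling $\gamma$, and explicit Euler stepping are assumptions and need not match the specific system analyzed in the paper.

```python
import numpy as np

# Textbook discrete nonlinear Schrödinger equation (assumed form):
#   i dψ_n/dt = -(ψ_{n+1} + ψ_{n-1}) + γ |ψ_n|² ψ_n
N, gamma, dt, steps = 64, 1.0, 1e-3, 5000
psi = np.zeros(N, dtype=complex)
psi[N // 2] = 1.0                                    # single-site excitation

for _ in range(steps):
    hop = np.roll(psi, 1) + np.roll(psi, -1)         # nearest-neighbour coupling
    psi += dt * (-1j) * (-hop + gamma * np.abs(psi)**2 * psi)  # explicit Euler (sketch only)

print("norm drift:", abs(np.sum(np.abs(psi)**2) - 1.0))  # Euler slowly violates the norm
```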
arXiv Detail & Related papers (2023-02-03T19:17:37Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
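As a toy illustration of discovering a symmetry generator from labeled data: labels generated by $f(x)=\Vert x\Vert^2$ are rotation-invariant, and the numpy sketch below recovers the rotation generator by minimizing an invariance residual. The loss, parametrization, and optimizer are illustrative choices of mine, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Labels y = f(x) = |x|² are invariant under rotations, whose generator is
# antisymmetric. We search for a 2x2 generator G such that the infinitesimal
# transform x -> x + εGx leaves f unchanged, i.e. the residual ∇f(x)·Gx = 0.
X = rng.normal(size=(512, 2))
grad_f = 2.0 * X                                       # ∇f(x) = 2x for f(x) = |x|²

G = rng.normal(size=(2, 2))
for _ in range(2000):
    r = np.einsum('ni,ij,nj->n', grad_f, G, X)         # residuals ∇f(x)·Gx
    grad_G = 2 * np.einsum('n,ni,nj->ij', r, grad_f, X) / len(X)
    G -= 0.02 * grad_G                                 # descent kills the symmetric part
    G /= np.linalg.norm(G)                             # keep |G| = 1 to avoid G -> 0

print(np.round(G, 3))   # -> ±[[0, 0.707], [-0.707, 0]], the rotation generator
```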
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Entanglement Dynamics in Anti-$\mathcal{PT}$-Symmetric Systems [2.5087808172987187]
Entanglement dynamics in anti-$\mathcal{PT}$-symmetric ($\mathcal{APT}$-symmetric) systems has not previously been investigated in either theory or experiment.
Here, we investigate the entanglement evolution of two qubits in an $\mathcal{APT}$-symmetric system.
Our findings reveal novel phenomena of entanglement evolution in the $\mathcal{APT}$-symmetric system.
arXiv Detail & Related papers (2022-07-14T08:48:49Z)
- Learning quantum symmetries with interactive quantum-classical variational algorithms [0.0]
A symmetry of a state $\vert \psi \rangle$ is a unitary operator of which $\vert \psi \rangle$ is an eigenvector.
Such symmetries provide key physical insight into the quantum system.
We develop a variational hybrid quantum-classical learning scheme to systematically probe for symmetries of $\vert \psi \rangle$.
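As a purely classical stand-in for what such a probe certifies, the sketch below checks the defining property $\vert\langle\psi\vert U\vert\psi\rangle\vert = 1$ for a one-parameter family $U(\theta)=e^{-i\theta Z}$. The states and generator are illustrative choices, and nothing here reproduces the paper's variational quantum-classical scheme.

```python
import numpy as np
from scipy.linalg import expm

# A unitary U is a symmetry of |ψ⟩ when |⟨ψ|U|ψ⟩| = 1, i.e. |ψ⟩ is an
# eigenvector of U. Scan a one-parameter family U(θ) = exp(-iθZ) against
# two test states and check this property by brute force.
Z = np.diag([1.0, -1.0])
zero = np.array([1.0, 0.0])                  # a Z eigenvector
plus = np.array([1.0, 1.0]) / np.sqrt(2)     # not a Z eigenvector

for name, psi in [("|0>", zero), ("|+>", plus)]:
    overlaps = [abs(psi.conj() @ expm(-1j * t * Z) @ psi)
                for t in np.linspace(0, np.pi, 5)]
    print(name, np.round(overlaps, 3))
# |0> keeps overlap 1 for every θ, so U(θ) is a symmetry of |0>;
# |+> gives |cos θ|, so exp(-iθZ) is not a symmetry of |+>.
```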
arXiv Detail & Related papers (2022-06-23T20:41:26Z)
- Boundary time crystals in collective $d$-level systems [64.76138964691705]
Boundary time crystals (BTCs) are non-equilibrium phases of matter occurring in quantum systems in contact with an environment.
We study BTCs in collective $d$-level systems, focusing on the cases with $d=2$, $3$, and $4$.
arXiv Detail & Related papers (2021-02-05T19:00:45Z)
- Keldysh Rotation in the Large-N Expansion and String Theory Out of Equilibrium [0.0]
We extend our study of the large-$N$ expansion of general non-equilibrium many-body systems with matrix degrees of freedom $M$.
We develop a novel "signpost" notation for non-equilibrium Feynman diagrams in the Keldysh-rotated form.
arXiv Detail & Related papers (2020-10-20T23:30:21Z)
- Sub-bosonic (deformed) ladder operators [62.997667081978825]
We present a class of deformed creation and annihilation operators that originates from a rigorous notion of fuzziness.
This leads to deformed, sub-bosonic commutation relations inducing a simple algebraic structure with modified eigenenergies and Fock states.
In addition, we investigate possible consequences of the introduced formalism in quantum field theories, as for instance, deviations from linearity in the dispersion relation for free quasibosons.
arXiv Detail & Related papers (2020-09-10T20:53:58Z)
- The world as a neural network [0.0]
We discuss a possibility that the universe on its most fundamental level is a neural network.
We identify two different types of dynamical degrees of freedom: "trainable" variables and "hidden" variables.
We argue that the entropy production in such a system is a local function of the symmetries of the Onsager-Hilbert term.
arXiv Detail & Related papers (2020-08-04T17:10:46Z)
- Learning nonlinear dynamical systems from a single trajectory [102.60042167341956]
We introduce algorithms for learning nonlinear dynamical systems of the form $x_{t+1}=\sigma(\Theta^{*}x_t)+\varepsilon_t$.
We give an algorithm that recovers the weight matrix $\Theta^{*}$ from a single trajectory with optimal sample complexity and linear running time.
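A minimal sketch of the task: simulate one trajectory of $x_{t+1}=\sigma(\Theta^{*}x_t)+\varepsilon_t$ and fit $\Theta$ by plain nonlinear least squares. The link $\sigma=\tanh$, the dimensions, and the gradient-descent fit are my assumptions; the paper's algorithm attains optimal sample complexity and linear running time, which this naive sketch makes no claim to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generate a single trajectory of x_{t+1} = tanh(Θ* x_t) + ε_t.
d, T, noise = 3, 2000, 0.5
Theta_star = 0.9 * np.linalg.qr(rng.normal(size=(d, d)))[0]   # well-conditioned ground truth

X = np.zeros((T + 1, d))
for t in range(T):
    X[t + 1] = np.tanh(X[t] @ Theta_star.T) + noise * rng.normal(size=d)

# Naive fit: gradient descent on the one-step squared prediction error.
Theta = np.zeros((d, d))
for _ in range(500):
    pred = np.tanh(X[:-1] @ Theta.T)                  # model's one-step predictions
    err = pred - X[1:]
    grad = ((err * (1 - pred**2)).T @ X[:-1]) / T     # chain rule through tanh
    Theta -= 0.5 * grad

print("recovery error:", np.linalg.norm(Theta - Theta_star))
```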
arXiv Detail & Related papers (2020-04-30T10:42:48Z)
- Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
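The projection-plus-retraction pattern behind optimization on $O(d)$ can be sketched in a few lines of numpy. The objective, step size, and QR retraction below are standard illustrative choices of mine, not the paper's stochastic-flow construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Riemannian gradient descent on O(d): project the Euclidean gradient onto
# the tangent space at W, then retract back to the manifold via QR.
d = 5
A = rng.normal(size=(d, d))
U, _, Vt = np.linalg.svd(A)
W_opt = U @ Vt                                     # Procrustes optimum of |W - A|² over O(d)

W = np.linalg.qr(rng.normal(size=(d, d)))[0]       # random starting point on O(d)
if np.linalg.det(W) * np.linalg.det(W_opt) < 0:
    W[:, 0] *= -1                                  # O(d) has two components; start on the optimum's

for _ in range(200):
    G = W - A                                      # Euclidean gradient of ½|W - A|²
    rgrad = G - W @ ((W.T @ G + G.T @ W) / 2)      # projection onto the tangent space at W
    Q, R = np.linalg.qr(W - 0.1 * rgrad)
    W = Q * np.sign(np.diag(R))                    # QR retraction with canonical signs

print("gap to optimum:", np.linalg.norm(W - W_opt))
```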
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
- Backward Feature Correction: How Deep Learning Performs Deep (Hierarchical) Learning [66.05472746340142]
This paper analyzes how multi-layer neural networks can perform hierarchical learning _efficiently_ and _automatically_ by SGD on the training objective.
We establish a new principle called "backward feature correction", where the errors in the lower-level features can be automatically corrected when training together with the higher-level layers.
arXiv Detail & Related papers (2020-01-13T17:28:29Z)