Emergent learning in physical systems as feedback-based aging in a glassy landscape
- URL: http://arxiv.org/abs/2309.04382v2
- Date: Tue, 31 Oct 2023 01:19:15 GMT
- Title: Emergent learning in physical systems as feedback-based aging in a glassy landscape
- Authors: Vidyesh Rao Anisetti, Ananth Kandala, J. M. Schwarz
- Abstract summary: We show that the learning dynamics resembles an aging process, where the system relaxes in response to repeated application of the feedback boundary forces.
We also observe that the square root of the mean-squared error as a function of epoch takes on a non-exponential form, which is a typical feature of glassy systems.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: By training linear physical networks to learn linear transformations, we
discern how their physical properties evolve due to weight update rules. Our
findings highlight a striking similarity between the learning behaviors of such
networks and the processes of aging and memory formation in disordered and
glassy systems. We show that the learning dynamics resembles an aging process,
where the system relaxes in response to repeated application of the feedback
boundary forces in the presence of an input force, thus encoding a memory of the
input-output relationship. With this relaxation comes an increase in the
correlation length, which is indicated by the two-point correlation function
for the components of the network. We also observe that the square root of the
mean-squared error as a function of epoch takes on a non-exponential form,
which is a typical feature of glassy systems. This physical interpretation
suggests that by encoding more detailed information into input and feedback
boundary forces, the process of emergent learning can be rather ubiquitous and,
thus, serve as a very early physical mechanism, from an evolutionary
standpoint, for learning in biological systems.
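The non-exponential decay of the root-mean-squared error described in the abstract is often modeled in the glass literature by a Kohlrausch-Williams-Watts (KWW) stretched exponential, exp(-(t/tau)^beta) with beta < 1. The sketch below is purely illustrative: it fits synthetic "sqrt(MSE) vs. epoch" data to a KWW form by grid search, and the function form, parameter ranges, and data are assumptions for demonstration, not the authors' actual fitting procedure.

```python
import numpy as np

def kww(t, a, tau, beta):
    # Kohlrausch-Williams-Watts stretched exponential, a common
    # non-exponential relaxation form in glassy systems
    return a * np.exp(-(t / tau) ** beta)

# Synthetic "sqrt(MSE) vs. epoch" curve with known parameters
# (illustrative stand-in for a real training curve).
rng = np.random.default_rng(0)
epochs = np.arange(1, 201, dtype=float)
y = kww(epochs, a=1.0, tau=40.0, beta=0.6)
y += 0.002 * rng.standard_normal(epochs.size)

# Brute-force least-squares fit over a small parameter grid
# (amplitude held at 1.0 for simplicity; no SciPy required).
best, best_err = None, np.inf
for tau in np.linspace(10.0, 80.0, 71):
    for beta in np.linspace(0.3, 1.0, 71):
        err = np.sum((y - kww(epochs, 1.0, tau, beta)) ** 2)
        if err < best_err:
            best, best_err = (tau, beta), err

tau_hat, beta_hat = best
print(f"tau ~ {tau_hat:.1f}, beta ~ {beta_hat:.2f}")
```

A fitted stretching exponent beta well below 1 would signal the kind of non-exponential relaxation the abstract attributes to glassy dynamics; beta = 1 recovers a simple exponential.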
Related papers
- Coding schemes in neural networks learning classification tasks [52.22978725954347]
We investigate fully-connected, wide neural networks learning classification tasks.
We show that the networks acquire strong, data-dependent features.
Surprisingly, the nature of the internal representations depends crucially on the neuronal nonlinearity.
arXiv Detail & Related papers (2024-06-24T14:50:05Z)
- A simple theory for training response of deep neural networks [0.0]
Deep neural networks give us a powerful method to model the training dataset's relationship between input and output.
We show that the training response consists of different factors depending on training stage, activation function, and training method.
In addition, we show feature space reduction as an effect of training dynamics, which can result in network fragility.
arXiv Detail & Related papers (2024-05-07T07:20:15Z)
- A Waddington landscape for prototype learning in generalized Hopfield networks [0.0]
We study the learning dynamics of Generalized Hopfield networks.
We observe a strong resemblance to the canalized, or low-dimensional, dynamics of cells as they differentiate.
arXiv Detail & Related papers (2023-12-04T21:28:14Z)
- Critical Learning Periods for Multisensory Integration in Deep Networks [112.40005682521638]
We show that the ability of a neural network to integrate information from diverse sources hinges critically on being exposed to properly correlated signals during the early phases of training.
We show that critical periods arise from the complex and unstable early transient dynamics, which largely determine the final performance of the trained system and its learned representations.
arXiv Detail & Related papers (2022-10-06T23:50:38Z)
- Synergistic information supports modality integration and flexible learning in neural networks solving multiple tasks [107.8565143456161]
We investigate the information processing strategies adopted by simple artificial neural networks performing a variety of cognitive tasks.
Results show that synergy increases as neural networks learn multiple diverse tasks.
Randomly turning off neurons during training through dropout increases network redundancy, corresponding to an increase in robustness.
arXiv Detail & Related papers (2022-10-06T15:36:27Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- A neural anisotropic view of underspecification in deep learning [60.119023683371736]
We show that the way neural networks handle the underspecification of problems is highly dependent on the data representation.
Our results highlight that understanding the architectural inductive bias in deep learning is fundamental to address the fairness, robustness, and generalization of these systems.
arXiv Detail & Related papers (2021-04-29T14:31:09Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- A Framework for Learning Invariant Physical Relations in Multimodal Sensory Processing [0.0]
We design a novel neural network architecture capable of learning, in an unsupervised manner, relations among sensory cues.
We describe the core system functionality when learning arbitrary non-linear relations in low-dimensional sensory data.
We demonstrate this through a real-world learning problem, where, from standard RGB camera frames, the network learns the relations between physical quantities.
arXiv Detail & Related papers (2020-06-30T08:42:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.