Binding threshold units with artificial oscillatory neurons
- URL: http://arxiv.org/abs/2505.03648v1
- Date: Tue, 06 May 2025 15:54:52 GMT
- Title: Binding threshold units with artificial oscillatory neurons
- Authors: Vladimir Fanaskov, Ivan Oseledets
- Abstract summary: We present a theoretical framework that clearly distinguishes oscillatory neurons from threshold units and establishes a coupling mechanism between them. We demonstrate the practical realization of this particular coupling through illustrative toy experiments.
- Score: 4.347494885647007
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Artificial Kuramoto oscillatory neurons were recently introduced as an alternative to threshold units. Empirical evidence suggests that oscillatory units outperform threshold units in several tasks, including unsupervised object discovery and certain reasoning problems. The proposed coupling mechanism for these oscillatory neurons is heterogeneous, combining a generalized Kuramoto equation with standard coupling methods used for threshold units. In this research note, we present a theoretical framework that clearly distinguishes oscillatory neurons from threshold units and establishes a coupling mechanism between them. We argue that, from a biological standpoint, oscillatory and threshold units realise distinct aspects of neural coding: roughly, threshold units model the intensity of neuron firing, while oscillatory units facilitate information exchange by frequency modulation. To derive the interaction between these two types of units, we constrain their dynamics by focusing on dynamical systems that admit Lyapunov functions. For threshold units, this leads to the Hopfield associative memory model, and for oscillatory units it yields a specific form of the generalized Kuramoto model. The resulting dynamical systems can be naturally coupled to form a Hopfield-Kuramoto associative memory model, which also admits a Lyapunov function. Various forms of coupling are possible. Notably, oscillatory neurons can be employed to implement a low-rank correction to the weight matrix of a Hopfield network. This correction can be viewed either as a form of Hebbian learning or as the popular LoRA method used for fine-tuning large language models. We demonstrate the practical realization of this particular coupling through illustrative toy experiments.
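The low-rank, oscillator-driven correction described in the abstract can be illustrated with a toy numerical sketch. Everything below is an assumption made for illustration, not the paper's construction: the cos(theta) modulation of the correction, the factor shapes of `U` and `V`, and the coupling constants are all hypothetical, and the update rules are the textbook asynchronous Hopfield and all-to-all Kuramoto forms.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 4  # threshold (Hopfield) units and oscillatory (Kuramoto) units

# Hopfield weights storing one pattern via the Hebbian outer-product rule.
pattern = rng.choice([-1.0, 1.0], size=n)
W = np.outer(pattern, pattern) / n
np.fill_diagonal(W, 0.0)

# Hypothetical low-rank factors for a LoRA-style correction: the effective
# weights are taken to be W + U diag(cos(theta)) V^T, with theta the
# oscillator phases. This functional form is an assumption.
U = 0.1 * rng.standard_normal((n, m))
V = 0.1 * rng.standard_normal((n, m))

K = 1.0                                  # Kuramoto coupling strength
omega = rng.standard_normal(m)           # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, m)   # oscillator phases
s = rng.choice([-1.0, 1.0], size=n)      # Hopfield state
dt = 0.1

def effective_weights(theta):
    return W + U @ np.diag(np.cos(theta)) @ V.T

def energy(s, theta):
    """Illustrative energy combining Hopfield and Kuramoto terms."""
    return (-0.5 * s @ effective_weights(theta) @ s
            - K / (2 * m) * np.sum(np.cos(theta[:, None] - theta[None, :])))

for _ in range(100):
    # Standard all-to-all Kuramoto phase update.
    theta += dt * (omega + K / m * np.sin(theta[None, :] - theta[:, None]).sum(axis=1))
    # Asynchronous Hopfield update under the phase-modulated weights.
    i = rng.integers(n)
    s[i] = np.sign(effective_weights(theta)[i] @ s) or 1.0  # break ties to +1

print(np.array_equal(np.abs(s), np.ones(n)))  # True: state stays in {-1, +1}
```

The point of the sketch is only structural: the oscillators evolve on their own timescale while continuously reshaping the threshold units' weight matrix through a rank-`m` term, which is the coupling pattern the abstract describes.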
Related papers
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation.
Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems.
Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
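The underdamped Langevin equation referenced in this entry is standard; a minimal Euler-Maruyama discretization of it (independent of the LangevinFlow architecture, whose details are not given above, and with a simple quadratic well standing in for the learned potential) looks like:

```python
import numpy as np

rng = np.random.default_rng(1)

gamma = 0.5      # damping coefficient
dt = 0.01
n_steps = 2000

def grad_potential(x):
    # Quadratic well U(x) = x**2 / 2, a stand-in for a learned potential.
    return x

x = np.array([2.0])  # position (the latent variable)
v = np.array([0.0])  # velocity, giving the dynamics inertia

for _ in range(n_steps):
    # dv = (-grad U(x) - gamma * v) dt + sqrt(2 * gamma) dW  (unit temperature)
    noise = np.sqrt(2.0 * gamma * dt) * rng.standard_normal(x.shape)
    v = v + dt * (-grad_potential(x) - gamma * v) + noise
    x = x + dt * v

print("final position:", float(x[0]))
```

The inertia (velocity), damping, and potential terms in the update correspond to the physical priors the summary lists.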
arXiv Detail & Related papers (2025-07-15T17:57:48Z)
- Transient Dynamics in Lattices of Differentiating Ring Oscillators [0.34952465649465553]
Recurrent neural networks (RNNs) are machine learning models widely used for learning temporal relationships.
We show via numerical simulation that large lattices of differentiating neuron rings exhibit local neural synchronization behavior.
arXiv Detail & Related papers (2025-06-08T18:29:15Z)
- Allostatic Control of Persistent States in Spiking Neural Networks for perception and computation [79.16635054977068]
We introduce a novel model for updating perceptual beliefs about the environment by extending the concept of Allostasis to the control of internal representations.
In this paper, we focus on an application in numerical cognition, where a bump of activity in an attractor network is used as a spatial numerical representation.
arXiv Detail & Related papers (2025-03-20T12:28:08Z)
- Covariant non-perturbative pointer variables for quantum fields [44.99833362998488]
We derive and renormalize the integro-differential equation that governs the detector pointer-variable dynamics.
Our formal solution, expressed in terms of Green's functions, allows for the covariant and causal analysis of induced observables on the field.
arXiv Detail & Related papers (2025-02-03T11:53:31Z)
- Generative Modeling of Neural Dynamics via Latent Stochastic Differential Equations [1.5467259918426441]
We propose a framework for developing computational models of biological neural systems.
We employ a system of coupled differential equations with differentiable drift and diffusion functions.
We show that these hybrid models achieve competitive performance in predicting stimulus-evoked neural and behavioral responses.
arXiv Detail & Related papers (2024-12-01T09:36:03Z)
- Efficiency of Dynamical Decoupling for (Almost) Any Spin-Boson Model [44.99833362998488]
We analytically study the dynamical decoupling of a two-level system coupled with a structured bosonic environment.
We find sufficient conditions under which dynamical decoupling works for such systems.
Our bounds reproduce the correct scaling in various relevant system parameters.
arXiv Detail & Related papers (2024-09-24T04:58:28Z)
- Harnessing Neural Unit Dynamics for Effective and Scalable Class-Incremental Learning [38.09011520275557]
Class-incremental learning (CIL) aims to train a model to learn new classes from non-stationary data streams without forgetting old ones.
We propose a new kind of connectionist model by tailoring neural unit dynamics that adapt the behavior of neural networks for CIL.
arXiv Detail & Related papers (2024-06-04T15:47:03Z)
- Integrating GNN and Neural ODEs for Estimating Non-Reciprocal Two-Body Interactions in Mixed-Species Collective Motion [0.0]
We present a novel deep learning framework for estimating the underlying equations of motion from observed trajectories.
Our framework integrates graph neural networks with neural differential equations, enabling effective prediction of two-body interactions.
arXiv Detail & Related papers (2024-05-26T09:47:17Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Solving the nuclear pairing model with neural network quantum states [58.720142291102135]
We present a variational Monte Carlo method that solves the nuclear many-body problem in the occupation number formalism.
A memory-efficient version of the reconfiguration algorithm is developed to train the network by minimizing the expectation value of the Hamiltonian.
arXiv Detail & Related papers (2022-11-09T00:18:01Z)
- Understanding Neural Coding on Latent Manifolds by Sharing Features and Dividing Ensembles [3.625425081454343]
Systems neuroscience relies on two complementary views of neural data, characterized by single neuron tuning curves and analysis of population activity.
These two perspectives combine elegantly in neural latent variable models that constrain the relationship between latent variables and neural activity.
We propose feature sharing across neural tuning curves, which significantly improves performance and leads to better-behaved optimization.
arXiv Detail & Related papers (2022-10-06T18:37:49Z)
- Fermionic approach to variational quantum simulation of Kitaev spin models [50.92854230325576]
Kitaev spin models are well known for being exactly solvable in a certain parameter regime via a mapping to free fermions.
We use classical simulations to explore a novel variational ansatz that takes advantage of this fermionic representation.
We also comment on the implications of our results for simulating non-Abelian anyons on quantum computers.
arXiv Detail & Related papers (2022-04-11T18:00:01Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Solving the Bose-Hubbard model in new ways [0.0]
We introduce a new method for analysing the Bose-Hubbard model for an array of bosons with nearest neighbor interactions.
It is based on a number-theoretic implementation of the creation and annihilation operators that constitute the model.
We provide a rigorous computer assisted proof of quantum phase transitions in finite systems of this type.
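The creation and annihilation operators that constitute the Bose-Hubbard model are standard, even though the paper's number-theoretic implementation of them is not detailed above. A generic truncated-matrix construction, including a small two-site Hamiltonian (the hopping `J` and interaction `U` values here are arbitrary):

```python
import numpy as np

n_max = 5  # truncate each site's Fock space at n_max bosons
dim = n_max + 1

# Annihilation operator in the number basis: a|n> = sqrt(n)|n-1>.
a = np.diag(np.sqrt(np.arange(1, dim)), k=1)
a_dag = a.T.copy()
n_op = a_dag @ a  # number operator, diag(0, 1, ..., n_max)

# Canonical commutator [a, a^dagger] = 1 holds below the truncation edge.
comm = a @ a_dag - a_dag @ a
print(np.allclose(np.diag(comm)[:-1], 1.0))  # True except at the last level

# Two-site Bose-Hubbard Hamiltonian with hopping J and on-site interaction U.
J, U = 1.0, 2.0
I = np.eye(dim)
a1, a2 = np.kron(a, I), np.kron(I, a)
n1, n2 = a1.T @ a1, a2.T @ a2
eye = np.eye(dim * dim)
H = -J * (a1.T @ a2 + a2.T @ a1) + U / 2 * (n1 @ (n1 - eye) + n2 @ (n2 - eye))
print(np.allclose(H, H.T))  # Hermitian (real symmetric) as required
```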
arXiv Detail & Related papers (2021-06-17T08:41:37Z)
- Memory kernel and divisibility of Gaussian Collisional Models [0.0]
Memory effects in the dynamics of open systems have been the subject of significant interest in the last decades.
We analyze two types of interactions, a beam-splitter implementing a partial SWAP and a two-mode squeezing, which entangles the ancillas and feeds excitations into the system.
By analyzing the memory kernel and divisibility for these two representative scenarios, our results help to shed light on the intricate mechanisms behind memory effects in the quantum domain.
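The beam-splitter interaction that implements a partial SWAP acts on the quadratures (x1, p1, x2, p2) as a symplectic rotation. A generic check of that structure (an illustration of the standard Gaussian formalism, not this paper's construction; the angle parametrization is an assumption, with theta = pi/2 giving a full SWAP):

```python
import numpy as np

theta = np.pi / 4  # partial-SWAP angle; theta = pi/2 swaps the two modes

# Symplectic matrix of a beam splitter on quadratures (x1, p1, x2, p2).
c, s = np.cos(theta), np.sin(theta)
S = np.array([[c, 0, s, 0],
              [0, c, 0, s],
              [-s, 0, c, 0],
              [0, -s, 0, c]])

# Symplectic form Omega; any Gaussian unitary must satisfy S Omega S^T = Omega.
Omega = np.array([[0, 1, 0, 0],
                  [-1, 0, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, -1, 0]])
print(np.allclose(S @ Omega @ S.T, Omega))  # True: S preserves the symplectic form
```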
arXiv Detail & Related papers (2020-08-03T10:28:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.