Phenomenological Model of Superconducting Optoelectronic Loop Neurons
- URL: http://arxiv.org/abs/2210.09976v1
- Date: Tue, 18 Oct 2022 16:38:35 GMT
- Authors: Jeffrey M. Shainline, Bryce A. Primavera, and Saeed Khan
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Superconducting optoelectronic loop neurons are a class of circuits
potentially conducive to networks for large-scale artificial cognition. These
circuits employ superconducting components including single-photon detectors,
Josephson junctions, and transformers to achieve neuromorphic functions. To
date, all simulations of loop neurons have used first-principles circuit
analysis to model the behavior of synapses, dendrites, and neurons. These
circuit models are computationally inefficient and leave opaque the
relationship between loop neurons and other complex systems. Here we introduce
a modeling framework that captures the behavior of the relevant synaptic,
dendritic, and neuronal circuits at a phenomenological level without resorting
to full circuit equations. Within this compact model, each dendrite is
discovered to obey a single nonlinear leaky-integrator ordinary differential
equation, while a neuron is modeled as a dendrite with a thresholding element
and an additional feedback mechanism for establishing a refractory period. A
synapse is modeled as a single-photon detector coupled to a dendrite, where the
response of the single-photon detector follows a closed-form expression. We
quantify the accuracy of the phenomenological model relative to circuit
simulations and find that the approach reduces computational time by a factor
of ten thousand while maintaining accuracy of one part in ten thousand. We
demonstrate the use of the model with several basic examples. The net increase
in computational efficiency enables future simulation of large networks, while
the formulation provides a connection to a large body of work in applied
mathematics, computational neuroscience, and physical systems such as spin
glasses.
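The abstract's central claim is that each dendrite reduces to a single nonlinear leaky-integrator ODE, and that a neuron is a dendrite plus a thresholding element and refractory feedback. A minimal sketch of that structure is below; it is not the paper's calibrated model. The source term `g`, the time constant `tau`, the `threshold`, and the `refractory` duration are all illustrative stand-ins, not values or forms taken from the paper.

```python
import math

def g(phi, s, gain=1.0):
    """Placeholder nonlinear source term. The paper derives this from the
    Josephson circuit response; here we use a saturating stand-in."""
    return gain * max(0.0, math.tanh(phi) - 0.3 * s)

def simulate(phi_of_t, tau=50.0, threshold=0.7, refractory=20.0,
             dt=0.1, steps=2000):
    """Forward-Euler integration of ds/dt = g(phi, s) - s/tau, with a
    threshold crossing and refractory feedback standing in for the neuron."""
    s = 0.0
    refr = 0.0  # time remaining in the refractory period
    spikes = []
    for k in range(steps):
        t = k * dt
        # Refractory feedback modeled here by suppressing the input flux.
        phi = 0.0 if refr > 0.0 else phi_of_t(t)
        s += dt * (g(phi, s) - s / tau)  # leaky integration step
        refr = max(0.0, refr - dt)
        if s >= threshold and refr == 0.0:
            spikes.append(t)
            s = 0.0           # reset the integrated signal after a spike
            refr = refractory
    return spikes

spikes = simulate(lambda t: 1.0)  # constant drive yields periodic spiking
print(len(spikes))
```

Under constant drive the integrator charges to threshold, fires, and is held silent for the refractory window, so inter-spike intervals are bounded below by `refractory`; a time-varying `phi_of_t` would model synaptic photon arrivals instead.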
Related papers
- Relating Superconducting Optoelectronic Networks to Classical Neurodynamics (arXiv, 2024-09-26): We present a phenomenological model of superconducting loop neurons that eliminates the need to solve the Josephson circuit equations describing synapses and dendrites. For some circuit parameters it is possible to represent the downstream dendritic response to a single spike as well as to coincidences or sequences of spikes. The governing equations are shown to be nearly identical to those ubiquitous in the neuroscience literature for modeling leaky-integrator dendrites and neurons.
- Learning with Chemical versus Electrical Synapses -- Does it Make a Difference? (arXiv, 2023-11-21): Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state of the art of AI systems. We conduct experiments with autonomous lane-keeping in a photorealistic autonomous-driving simulator to evaluate their performance under diverse conditions.
- Pairing-based graph neural network for simulating quantum materials (arXiv, 2023-11-03): We develop a pairing-based graph neural network for simulating quantum many-body systems. Variational Monte Carlo with our neural network simultaneously provides an accurate, flexible, and scalable method for simulating many-electron systems.
- A versatile circuit for emulating active biological dendrites applied to sound localisation and neuron imitation (arXiv, 2023-10-25): We introduce a versatile circuit that emulates a segment of a dendrite, exhibiting gain, introducing delays, and performing integration. We also find that dendrites can form bursting neurons. This finding suggests the potential to fabricate neural networks composed solely of dendrite circuits.
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks (arXiv, 2023-06-14): We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron. Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron with under ten thousand trainable parameters. We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
- A Bio-Inspired Chaos Sensor Model Based on the Perceptron Neural Network: Machine Learning Concept and Application for Computational Neuro-Science (arXiv, 2023-06-03): The study presents a bio-inspired chaos sensor model based on the perceptron neural network for estimating the entropy of spike trains in neurodynamic systems. The model dynamically tracks the chaotic behavior of a spike signal and transmits this information to other parts of the neurodynamic model for further processing.
- Capturing dynamical correlations using implicit neural representations (arXiv, 2023-04-08): We develop an artificial-intelligence framework that combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data. In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach (arXiv, 2020-07-02): We study estimation in a class of generalized structural equation models (SEMs). We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent. For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
- Multipole Graph Neural Operator for Parametric Partial Differential Equations (arXiv, 2020-06-16): One of the main challenges in using deep-learning-based methods for simulating physical systems is formulating physics-based data. We propose a novel multi-level graph neural network framework that captures interactions at all ranges with only linear complexity. Experiments confirm that our multi-graph network learns discretization-invariant solution operators for PDEs and can be evaluated in linear time.
- Training End-to-End Analog Neural Networks with Equilibrium Propagation (arXiv, 2020-06-02): We introduce a principled method to train end-to-end analog neural networks by gradient descent. We show mathematically that a class of analog neural networks (called nonlinear resistive networks) are energy-based models. Our work can guide the development of a new generation of ultra-fast, compact, and low-power neural networks supporting on-chip learning.
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.