Relating Superconducting Optoelectronic Networks to Classical Neurodynamics
- URL: http://arxiv.org/abs/2409.18016v1
- Date: Thu, 26 Sep 2024 16:23:53 GMT
- Title: Relating Superconducting Optoelectronic Networks to Classical Neurodynamics
- Authors: Jeffrey M. Shainline, Bryce A. Primavera, Ryan O'Loughlin
- Abstract summary: We present a phenomenological model of superconducting loop neurons that eliminates the need to solve the Josephson circuit equations that describe synapses and dendrites.
For some circuit parameters it is possible to represent the downstream dendritic response to a single spike as well as coincidences or sequences of spikes.
The governing equations are shown to be nearly identical to those ubiquitous in the neuroscience literature for modeling leaky-integrator dendrites and neurons.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The circuits comprising superconducting optoelectronic synapses, dendrites, and neurons are described by numerically cumbersome and formally opaque coupled differential equations. Reference 1 showed that a phenomenological model of superconducting loop neurons eliminates the need to solve the Josephson circuit equations that describe synapses and dendrites. The initial goal of the model was to decrease the time required for simulations, yet an additional benefit of the model was increased transparency of the underlying neural circuit operations and conceptual clarity regarding the connection of loop neurons to other physical systems. Whereas the original model simplified the treatment of the Josephson-junction dynamics, essentially by only considering low-pass versions of the dendritic outputs, the model resorted to an awkward treatment of spikes generated by semiconductor transmitter circuits that required explicitly checking for threshold crossings and distinct treatment of time steps wherein somatic threshold is reached. Here we extend that model to simplify the treatment of spikes coming from somas, again making use of the fact that in neural systems the downstream recipients of spike events almost always perform low-pass filtering. We provide comparisons between the first and second phenomenological models, quantifying the accuracy of the additional approximations. We identify regions of circuit parameter space in which the extended model works well and regions where it works poorly. For some circuit parameters it is possible to represent the downstream dendritic response to a single spike as well as coincidences or sequences of spikes, indicating the model is not simply a reduction to rate coding. The governing equations are shown to be nearly identical to those ubiquitous in the neuroscience literature for modeling leaky-integrator dendrites and neurons.
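The leaky-integrator dynamics the abstract refers to can be illustrated with a minimal discrete-time sketch (a generic textbook form, not the paper's actual governing equations; the time constant `tau` and weight `w` are placeholder values):

```python
def leaky_integrator(spikes, tau=20.0, dt=1.0, w=1.0):
    """Forward-Euler update of the leaky-integrator ODE ds/dt = -s/tau + w*x(t)."""
    s = 0.0
    trace = []
    for x in spikes:
        # Each step: decay toward zero plus the weighted input drive.
        s += dt * (-s / tau + w * x)
        trace.append(s)
    return trace

# Response to a single spike at t = 0: a jump followed by exponential decay,
# the low-pass-filtered form the downstream dendrites are said to receive.
response = leaky_integrator([1.0] + [0.0] * 99)
```

This captures only the generic low-pass behavior; the paper's phenomenological model additionally handles coincidences and sequences of spikes, which a pure rate description would not.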
Related papers
- On the Trade-off Between Efficiency and Precision of Neural Abstraction
Neural abstractions have been recently introduced as formal approximations of complex, nonlinear dynamical models.
We employ formal inductive synthesis procedures to generate neural abstractions that result in dynamical models with these semantics.
arXiv Detail & Related papers (2023-07-28T13:22:32Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- A Bio-Inspired Chaos Sensor Model Based on the Perceptron Neural Network: Machine Learning Concept and Application for Computational Neuro-Science
The study presents a bio-inspired chaos sensor model based on the perceptron neural network for the estimation of entropy of spike train in neurodynamic systems.
The model is able to dynamically track the chaotic behavior of a spike signal and transmit this information to other parts of the neurodynamic model for further processing.
arXiv Detail & Related papers (2023-06-03T03:36:47Z)
- Phenomenological Model of Superconducting Optoelectronic Loop Neurons
Superconducting optoelectronic loop neurons are a class of circuits potentially conducive to networks for large-scale artificial cognition.
To date, all simulations of loop neurons have used first-principles circuit analysis to model the behavior of synapses, dendrites, and neurons.
Here we introduce a modeling framework that captures the behavior of the relevant synaptic, dendritic, and neuronal circuits.
arXiv Detail & Related papers (2022-10-18T16:38:35Z)
- Spiking neural network for nonlinear regression
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Recurrent networks improve neural response prediction and provide insights into underlying cortical circuits
CNN models have proven themselves as state-of-the-art models for predicting single-neuron responses to natural images in early visual cortical neurons.
We extend these models with recurrent convolutional layers, reflecting the well-known massive recurrence in the cortex.
We find that the hidden units in the recurrent circuits of the appropriate models, when trained on long-duration wide-field image presentations, exhibit similar temporal response dynamics and classical contextual modulations as observed in V1 neurons.
arXiv Detail & Related papers (2021-10-02T15:46:56Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Dissipative Rabi model in the dispersive regime
We present results on the dispersive regime of the dissipative Rabi model without taking the rotating wave approximation of the underlying Hamiltonian.
The results additionally predict new types of drive-induced qubit dissipation and dephasing not present in previous theories.
arXiv Detail & Related papers (2020-04-06T09:45:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.