Exploring Gain-Doped-Waveguide-Synapse for Neuromorphic Applications: A Pulsed Pump-Signal Approach
- URL: http://arxiv.org/abs/2507.05931v1
- Date: Tue, 08 Jul 2025 12:29:48 GMT
- Title: Exploring Gain-Doped-Waveguide-Synapse for Neuromorphic Applications: A Pulsed Pump-Signal Approach
- Authors: Robert Otupiri, Ripalta Stabile
- Abstract summary: We introduce the concept of Gain on Waveguide Dynamics for synapses, demonstrating how non-linear pulse transformations of input probe signals occur under various pump-probe configurations. By harnessing the complex interactions of asynchronous spiking pump techniques and ion densities in excited states, our method produces event-driven responses that mirror natural neuronal functions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neuromorphic computing promises to transform AI systems by enabling them to perceive, respond to, and adapt swiftly and accurately to dynamic data and user interactions. However, traditional silicon-based and hybrid electronic technologies for artificial neurons constrain neuromorphic processors in terms of flexibility, scalability, and energy efficiency. In this study, we pioneer the use of Doped-Gain-Layer-on-Waveguide-Synapses for bio-inspired neurons, utilizing a pulsed pump-signal mechanism to enhance neuromorphic computation. This approach addresses critical challenges in scalability and energy efficiency inherent in current technologies. We introduce the concept of Gain on Waveguide Dynamics for synapses, demonstrating how non-linear pulse transformations of input probe signals occur under various pump-probe configurations. Our findings reveal that pulse properties, primarily amplitude and period, together with material properties such as doping densities and population dynamics, strongly influence the generation of spiking responses that emulate neuronal behaviour and, in effect, determine how computational logic is realised. By harnessing the complex interactions of asynchronous spiking pump techniques and ion densities in excited states, our method produces event-driven responses that mirror natural neuronal functions. This gain-enhanced environment supports short-term memory capabilities alongside essential characteristics such as asynchronous spike generation, threshold operation, and temporal integration, which are foundational to brain-inspired spiking neural network paradigms.
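The sketch below is not the authors' model; it is a minimal illustrative approximation of the qualitative behaviour the abstract describes. It assumes a simple two-level rate equation for the excited-state ion density in a doped waveguide driven by an asynchronous pulsed pump, with a weak probe reading out the inversion. All parameter names and values (tau, sigma_a, sigma_e, N_total, pump times, etc.) are hypothetical placeholders chosen only to show temporal integration of pump pulses, threshold-like switching from absorption to gain, and decay of the stored inversion (short-term memory).

```python
# Illustrative sketch (assumed two-level rate-equation model, not the paper's):
# pulsed pump builds excited-state population; a weak probe sees gain or loss.
import numpy as np

# --- assumed material / waveguide parameters (placeholders) ---
tau     = 1e-3    # upper-state lifetime [s]
sigma_a = 2e-25   # probe absorption cross-section [m^2]
sigma_e = 3e-25   # probe emission cross-section [m^2]
N_total = 1e25    # dopant density [m^-3]
length  = 0.05    # doped waveguide length [m]

dt = 1e-6                       # integration step [s]
t  = np.arange(0.0, 0.02, dt)   # 20 ms simulation window

# Asynchronous pump pulse train: pump rate W_p(t) in s^-1 (placeholder values)
pump_times = [0.002, 0.004, 0.005, 0.011]   # pulse arrival times [s]
pulse_width, pulse_rate = 200e-6, 4e3
W_p = np.zeros_like(t)
for t0 in pump_times:
    W_p[(t >= t0) & (t < t0 + pulse_width)] = pulse_rate

# Integrate the excited-state population N2(t):
# dN2/dt = W_p * (N_total - N2) - N2 / tau   (probe assumed too weak to saturate)
N2 = np.zeros_like(t)
for i in range(1, len(t)):
    dN2 = W_p[i - 1] * (N_total - N2[i - 1]) - N2[i - 1] / tau
    N2[i] = N2[i - 1] + dN2 * dt

# Small-signal probe transmission exp(g * L) with
# g(t) = sigma_e * N2 - sigma_a * (N_total - N2)
gain_coeff   = sigma_e * N2 - sigma_a * (N_total - N2)
transmission = np.exp(gain_coeff * length)

# Threshold-like, event-driven response: the probe is amplified only while
# the integrated pump drive keeps the inversion above transparency.
above_threshold = transmission > 1.0
print(f"fraction of time above transparency: {above_threshold.mean():.2%}")
```

In this toy picture, closely spaced pump pulses accumulate inversion (temporal integration), the probe switches from attenuation to amplification only past the transparency point (threshold operation), and the inversion relaxes on the lifetime tau (short-term memory); the real device dynamics reported in the paper are richer than this two-level caricature.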
Related papers
- Pendulum Model of Spiking Neurons [0.0]
We propose a biologically inspired model of spiking neurons based on the dynamics of a damped, driven pendulum. We present an analysis of single-neuron dynamics and extend the model to multi-neuron layers governed by Spike-Timing Dependent Plasticity (STDP) learning rules.
arXiv Detail & Related papers (2025-07-29T18:21:51Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - NOBLE -- Neural Operator with Biologically-informed Latent Embeddings to Capture Experimental Variability in Biological Neuron Models [68.89389652724378]
NOBLE is a neural operator framework that learns a mapping from a continuous frequency-modulated embedding of interpretable neuron features to the somatic voltage response induced by current injection. It predicts distributions of neural dynamics accounting for the intrinsic experimental variability. NOBLE is the first scaled-up deep learning framework validated on real experimental data.
arXiv Detail & Related papers (2025-06-05T01:01:18Z) - Exploring neural oscillations during speech perception via surrogate gradient spiking neural networks [59.38765771221084]
We present a physiologically inspired speech recognition architecture compatible and scalable with deep learning frameworks.
We show end-to-end gradient descent training leads to the emergence of neural oscillations in the central spiking neural network.
Our findings highlight the crucial inhibitory role of feedback mechanisms, such as spike frequency adaptation and recurrent connections, in regulating and synchronising neural activity to improve recognition performance.
arXiv Detail & Related papers (2024-04-22T09:40:07Z) - Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - The Neuron as a Direct Data-Driven Controller [43.8450722109081]
This study extends the current normative models, which primarily optimize prediction, by conceptualizing neurons as optimal feedback controllers.
We model neurons as biologically feasible controllers which implicitly identify loop dynamics, infer latent states and optimize control.
Our model presents a significant departure from the traditional, feedforward, instant-response McCulloch-Pitts-Rosenblatt neuron, offering a novel and biologically-informed fundamental unit for constructing neural networks.
arXiv Detail & Related papers (2024-01-03T01:24:10Z) - Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - An accurate and flexible analog emulation of AdEx neuron dynamics in silicon [0.0]
This manuscript presents the analog neuron circuits of the mixed-signal accelerated neuromorphic system BrainScaleS-2.
They are capable of flexibly and accurately emulating the adaptive exponential integrate-and-fire model equations in combination with both current- and conductance-based synapses.
arXiv Detail & Related papers (2022-09-19T18:08:23Z) - Evolving spiking neuron cellular automata and networks to emulate in vitro neuronal activity [0.0]
We produce spiking neural systems that emulate the patterns of behavior of biological neurons in vitro.
Our models were able to produce a level of network-wide synchrony.
The genomes of the top-performing models indicate the excitability and density of connections in the model play an important role in determining the complexity of the produced activity.
arXiv Detail & Related papers (2021-10-15T17:55:04Z) - Intrinsic Spike Timing Dependent Plasticity in Stochastic Magnetic Tunnel Junctions Mediated by Heat Dynamics [0.0]
Neuromorphic computing aims to mimic the behavior of biological neurons and synapses using solid-state devices and circuits.
We propose a method to implement the Spike Timing Dependent Plasticity (STDP) behavior of biological synapses in Magnetic Tunnel Junction (MTJ) devices.
arXiv Detail & Related papers (2021-08-28T18:02:01Z) - Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z) - Ultra-Low-Power FDSOI Neural Circuits for Extreme-Edge Neuromorphic Intelligence [2.6199663901387997]
In-memory computing mixed-signal neuromorphic architectures provide promising ultra-low-power solutions for edge-computing sensory-processing applications.
We present a set of mixed-signal analog/digital circuits that exploit the features of advanced Fully-Depleted Silicon on Insulator (FDSOI) integration processes.
arXiv Detail & Related papers (2020-06-25T09:31:29Z)