An exact mathematical description of computation with transient
spatiotemporal dynamics in a complex-valued neural network
- URL: http://arxiv.org/abs/2311.16431v1
- Date: Tue, 28 Nov 2023 02:23:30 GMT
- Title: An exact mathematical description of computation with transient
spatiotemporal dynamics in a complex-valued neural network
- Authors: Roberto C. Budzinski, Alexandra N. Busch, Samuel Mestern, Erwan
Martin, Luisa H. B. Liboni, Federico W. Pasini, Ján Mináč, Todd Coleman,
Wataru Inoue, Lyle E. Muller
- Abstract summary: We study a complex-valued neural network (cv-NN) with linear, time-delayed interactions.
The cv-NN displays sophisticated spatiotemporal dynamics, including partially synchronized ``chimera'' states.
We demonstrate that computations in the cv-NN are decodable by living biological neurons.
- Score: 33.7054351451505
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study a complex-valued neural network (cv-NN) with linear, time-delayed
interactions. We report the cv-NN displays sophisticated spatiotemporal
dynamics, including partially synchronized ``chimera'' states. We then use
these spatiotemporal dynamics, in combination with a nonlinear readout, for
computation. The cv-NN can instantiate dynamics-based logic gates, encode
short-term memories, and mediate secure message passing through a combination
of interactions and time delays. The computations in this system can be fully
described in an exact, closed-form mathematical expression. Finally, using
direct intracellular recordings of neurons in slices from neocortex, we
demonstrate that computations in the cv-NN are decodable by living biological
neurons. These results demonstrate that complex-valued linear systems can
perform sophisticated computations, while also being exactly solvable. Taken
together, these results open future avenues for design of highly adaptable,
bio-hybrid computing systems that can interface seamlessly with other neural
networks.
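Since the update rule is linear even with delays, the full trajectory admits a closed-form expression; the sketch below illustrates the flavor of such a system. It is a minimal illustration, not the authors' code: the ring size, delay, and distance-dependent coupling kernel are assumptions chosen only to produce bounded transients with partial synchrony.
```python
import numpy as np

# Minimal sketch of a cv-NN with linear, time-delayed interactions
# (illustrative parameters, not the paper's). Each node carries a complex
# state; coupling on a ring decays with distance and adds a phase lag.
N, delay, T = 64, 5, 400
idx = np.arange(N)
d = np.minimum(np.abs(idx[:, None] - idx[None, :]),
               N - np.abs(idx[:, None] - idx[None, :]))
W = np.exp(-d / 8.0) * np.exp(1j * 0.3 * d)
W /= np.abs(np.linalg.eigvals(W)).max()   # spectral radius 1: long transients

rng = np.random.default_rng(0)
z = [np.exp(2j * np.pi * rng.random(N)) for _ in range(delay + 1)]  # history
for t in range(delay, delay + T):
    z.append(W @ z[t - delay])            # purely linear, delayed update
Z = np.stack(z)

# Nonlinear readout on the phases (the paper combines the linear dynamics
# with a nonlinear readout to compute).
phases = np.angle(Z)

# Exact solvability: stacking the last delay+1 states into one vector turns
# the update into a companion-form linear map C, so the whole trajectory has
# a closed form via the eigendecomposition of C.
```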
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the computational time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Hardware-Friendly Implementation of Physical Reservoir Computing with CMOS-based Time-domain Analog Spiking Neurons [0.26963330643873434]
This paper introduces a spiking neural network (SNN) for a hardware-friendly physical reservoir computing (RC) on a complementary metal-oxide-semiconductor (CMOS) platform.
We demonstrate RC through short-term memory and exclusive-OR tasks, and a spoken-digit recognition task with an accuracy of 97.7%; a minimal software sketch of reservoir computing on a temporal XOR task follows this entry.
arXiv Detail & Related papers (2024-09-18T00:23:00Z)
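Since this entry names short-term memory and exclusive-OR as benchmark tasks, a minimal software stand-in may help make the reservoir-computing recipe concrete. The sketch below uses a conventional echo state network in place of the CMOS analog reservoir; the reservoir size, spectral radius, task definition, and ridge regularization are all illustrative assumptions.
```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 100, 2000
u = rng.integers(0, 2, T)                       # binary input stream
y = np.roll(u, 1) ^ np.roll(u, 2)               # target: u[t-1] XOR u[t-2]

W_in = rng.uniform(-1.0, 1.0, N)
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # echo state property

X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])            # fixed random reservoir
    X[t] = x

# Only the linear readout is trained (ridge regression), as in RC.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
pred = (X @ W_out > 0.5).astype(int)
print("XOR accuracy:", (pred[2:] == y[2:]).mean())  # first 2 targets wrap
```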
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper shows that SNNs, when combined with synaptic delays and temporal coding, can perform (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- On the Computational Complexities of Complex-valued Neural Networks [0.0]
Complex-valued neural networks (CVNNs) are nonlinear filters used in the digital signal processing of complex-domain data.
This paper presents both the quantitative and computational complexities of CVNNs.
arXiv Detail & Related papers (2023-10-19T18:14:04Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
The ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Neural Operator Learning for Long-Time Integration in Dynamical Systems with Recurrent Neural Networks [1.6874375111244329]
Deep neural networks offer reduced computational costs during inference and can be trained directly from observational data.
Existing methods, however, cannot extrapolate accurately and are prone to error accumulation in long-time integration.
We address this issue by combining neural operators with recurrent neural networks, learning the operator mapping while the recurrent structure captures temporal dependencies.
arXiv Detail & Related papers (2023-03-03T22:19:23Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation; a generic sketch of implicit differentiation at an equilibrium follows this entry.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
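To make the equilibrium-based idea concrete, here is a generic sketch of implicit differentiation at a fixed point, in the spirit of deep equilibrium models; it is not the paper's SNN-specific procedure, and the toy update map and loss are assumptions.
```python
import numpy as np

# For a fixed point z* = f(z*, x; W), the implicit function theorem gives
# dL/dW through the adjoint solve (I - J)^T a = dL/dz*, with J = df/dz at z*,
# so no backpropagation through the forward relaxation is needed.

def f(z, x, W):
    return np.tanh(W @ z + x)                     # toy recurrent update

def fixed_point(x, W, iters=500):
    z = np.zeros_like(x)
    for _ in range(iters):                        # forward relaxation
        z = f(z, x, W)
    return z

def grad_W(x, W, target):
    z = fixed_point(x, W)
    s = 1.0 - np.tanh(W @ z + x) ** 2             # tanh' at the equilibrium
    J = s[:, None] * W                            # df/dz = diag(s) @ W
    dL_dz = z - target                            # L = 0.5 * ||z - target||^2
    a = np.linalg.solve(np.eye(len(z)) - J.T, dL_dz)   # adjoint solve
    return (a * s)[:, None] * z[None, :]          # dL/dW[i, j] = a_i s_i z*_j

rng = np.random.default_rng(0)
n = 20
W = rng.normal(0.0, 0.5 / np.sqrt(n), (n, n))     # small norm -> contraction
x, target = rng.normal(size=n), rng.normal(size=n)
g = grad_W(x, W, target)                          # gradient without BPTT
```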
- Neuromorphic Algorithm-hardware Codesign for Temporal Pattern Learning [11.781094547718595]
We derive an efficient training algorithm for Leaky Integrate-and-Fire neurons, capable of training an SNN to learn complex spatiotemporal patterns.
We have developed a CMOS circuit implementation for a memristor-based network of neurons and synapses that retains critical neural dynamics with reduced complexity.
arXiv Detail & Related papers (2021-04-21T18:23:31Z)
- Nonlinear computations in spiking neural networks through multiplicative synapses [3.1498833540989413]
Nonlinear computations can be implemented successfully in spiking neural networks, but this typically requires supervised training, and the resulting connectivity can be hard to interpret.
We show how to directly derive the required connectivity for several nonlinear dynamical systems.
arXiv Detail & Related papers (2020-09-08T16:47:27Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations; a minimal sketch of such a gated first-order update follows this entry.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
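As a rough illustration of the "linear first-order dynamical systems modulated by nonlinear gates" idea, here is a hedged sketch of a single liquid time-constant cell. The gate parameterization, constants, and the semi-implicit update are written from the general description above and should be treated as assumptions rather than the paper's exact formulation.
```python
import numpy as np

# A liquid time-constant (LTC) cell: a linear first-order leak whose
# effective time constant is modulated by a nonlinear gate f,
#   dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A.

def ltc_step(x, I, W, b, tau=1.0, A=1.0, dt=0.1):
    pre = W @ np.concatenate([x, I]) + b
    f = 1.0 / (1.0 + np.exp(-pre))       # nonlinear interlinked gate
    # fused semi-implicit update (stable even for large dt), in the spirit
    # of the paper's solver; keeps x bounded between 0 and A
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

rng = np.random.default_rng(0)
nx, ni = 8, 3
W = rng.normal(0.0, 0.5, (nx, nx + ni))  # illustrative weights
b = np.zeros(nx)
x = np.zeros(nx)
for _ in range(100):
    x = ltc_step(x, rng.normal(size=ni), W, b)
```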