Complex Recurrent Spectral Network
- URL: http://arxiv.org/abs/2312.07296v1
- Date: Tue, 12 Dec 2023 14:14:40 GMT
- Title: Complex Recurrent Spectral Network
- Authors: Lorenzo Chicchi, Lorenzo Giambagli, Lorenzo Buffoni, Raffaele Marino,
Duccio Fanelli
- Abstract summary: This paper presents a novel approach to advancing artificial intelligence (AI) through the development of the Complex Recurrent Spectral Network ($\mathbb{C}$-RSN).
The $\mathbb{C}$-RSN is designed to address a critical limitation in existing neural network models: their inability to emulate the complex processes of biological neural networks.
- Score: 1.0499611180329806
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a novel approach to advancing artificial intelligence
(AI) through the development of the Complex Recurrent Spectral Network
($\mathbb{C}$-RSN), an innovative variant of the Recurrent Spectral Network
(RSN) model. The $\mathbb{C}$-RSN is designed to address a critical limitation
in existing neural network models: their inability to emulate the complex
processes of biological neural networks dynamically and accurately. By
integrating key concepts from dynamical systems theory and leveraging
principles from statistical mechanics, the $\mathbb{C}$-RSN model introduces
localized non-linearity, complex fixed eigenvalues, and a distinct separation
of memory and input processing functionalities. These features collectively
enable the $\mathbb{C}$-RSN to evolve towards a dynamic, oscillating final state
that more closely mirrors biological cognition. Central to this work is the
exploration of how the $\mathbb{C}$-RSN manages to capture the rhythmic,
oscillatory dynamics intrinsic to biological systems, thanks to its complex
eigenvalue structure and the innovative segregation of its linear and
non-linear components. The model's ability to classify data through a
time-dependent function, and the localization of information processing, are
demonstrated through an empirical evaluation on the MNIST dataset. Remarkably,
distinct items supplied as sequential input yield temporal patterns that bear
the indirect imprint of the insertion order (and of the time separating
contiguous insertions).
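As a reading aid, here is a minimal sketch of the mechanism the abstract describes: the recurrent map is parameterized in spectral space, its eigenvalues are fixed complex numbers on the unit circle so that the free dynamics oscillate instead of decaying, and the non-linearity acts only on a localized subset of nodes. The update rule, the dimensions, and the choice of which nodes are non-linear are illustrative assumptions, not the authors' exact equations.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8          # number of nodes
K = 3          # nodes receiving the localized non-linearity (assumption)

# Fixed complex eigenvalues on the unit circle -> sustained oscillations.
omega = rng.uniform(0.1, 1.0, size=N)
Lam = np.diag(np.exp(1j * omega))

# Random invertible eigenvector basis; W = Phi Lam Phi^{-1} is the linear part.
Phi = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
W = Phi @ Lam @ np.linalg.inv(Phi)

def step(x, u):
    """One update: linear spectral rotation + non-linearity on K nodes only."""
    x = W @ x + u                                           # memory + input
    x[:K] = np.tanh(x[:K].real) + 1j * np.tanh(x[:K].imag)  # localized non-linearity
    return x

x = np.zeros(N, dtype=complex)
for t in range(50):
    u = np.zeros(N, dtype=complex)
    if t == 0:                         # a single item inserted at t = 0
        u[:] = rng.normal(size=N)
    x = step(x, u)

# The asymptotic state keeps oscillating instead of settling to a fixed point.
print(np.round(np.abs(x), 3))
```

Because every eigenvalue has unit modulus, the linear part neither contracts nor explodes, so the trajectory settles into sustained oscillations whose phase pattern depends on when inputs were injected, which is consistent with the order-imprint effect the abstract reports.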
Related papers
- Self-Organizing Recurrent Stochastic Configuration Networks for Nonstationary Data Modelling [3.8719670789415925]
Recurrent stochastic configuration networks (RSCNs) are a class of randomized models that have shown promise in modelling nonlinear dynamics.
This paper aims at developing a self-organizing version of RSCNs, termed as SORSCNs, to enhance the continuous learning ability of the network for modelling nonstationary data.
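For background, plain stochastic configuration networks grow incrementally: candidate hidden nodes get random weights, a candidate is kept only if it passes a supervisory condition on the current residual, and the output weights are refit by least squares. The sketch below illustrates that generic construction with a simple greedy residual-alignment score standing in for the supervisory inequality; it does not show the recurrent or self-organizing machinery of SORSCNs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression target.
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2

H = np.empty((len(X), 0))           # hidden outputs, grown column by column
residual = y.copy()
for _ in range(30):                 # add up to 30 hidden nodes
    best = None
    for _trial in range(20):        # sample candidate random nodes
        w, b = rng.normal(size=2), rng.normal()
        h = np.tanh(X @ w + b)
        score = np.dot(h, residual) ** 2 / np.dot(h, h)  # residual alignment
        if best is None or score > best[0]:
            best = (score, h)
    H = np.column_stack([H, best[1]])
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)         # refit output weights
    residual = y - H @ beta

print("final RMSE:", np.sqrt(np.mean(residual ** 2)))
```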
arXiv Detail & Related papers (2024-10-14T01:28:25Z)
- Fuzzy Recurrent Stochastic Configuration Networks for Industrial Data Analytics [3.8719670789415925]
This paper presents a novel neuro-fuzzy model, termed fuzzy recurrent stochastic configuration networks (F-RSCNs), for industrial data analytics.
The proposed F-RSCN is constructed by multiple sub-reservoirs, and each sub-reservoir is associated with a Takagi-Sugeno-Kang (TSK) fuzzy rule.
By integrating TSK fuzzy inference systems into RSCNs, F-RSCNs have strong fuzzy inference capability and can achieve sound performance for both learning and generalization.
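A rough structural sketch of what "multiple sub-reservoirs, each attached to a TSK rule" could look like: each rule has a Gaussian membership over the input, each sub-reservoir is a small echo-state-style recurrent map, and the per-rule outputs are blended by normalized firing strengths. Every name, dimension, and membership form here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
R, n = 3, 10                      # number of TSK rules / size of each sub-reservoir

centers = rng.uniform(-1, 1, size=R)                 # Gaussian membership centers
W_res = [0.9 * rng.normal(size=(n, n)) / np.sqrt(n) for _ in range(R)]
W_in  = [rng.normal(size=n) for _ in range(R)]
W_out = [rng.normal(size=n) for _ in range(R)]       # per-rule readouts (would be trained)

states = [np.zeros(n) for _ in range(R)]

def step(u):
    """Drive every sub-reservoir, then take a firing-strength-weighted blend."""
    mu = np.exp(-((u - centers) ** 2) / 0.5)         # rule firing strengths
    mu /= mu.sum()                                   # TSK-style defuzzification
    outs = []
    for r in range(R):
        states[r] = np.tanh(W_res[r] @ states[r] + W_in[r] * u)
        outs.append(W_out[r] @ states[r])
    return float(mu @ np.array(outs))

for u in np.sin(np.linspace(0, 6, 40)):              # a toy input sequence
    yhat = step(u)
print("last output:", yhat)
```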
arXiv Detail & Related papers (2024-07-06T01:40:31Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of large-kernel convolutional neural network (LKCNN) models.
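One way to probe the input-periodicity-versus-activation-periodicity relation the authors highlight is to compare the dominant FFT component of a signal with that of a convolutional feature map built from it. The kernel and the comparison below are illustrative assumptions, not the paper's protocol.

```python
import numpy as np

rng = np.random.default_rng(3)

def dominant_freq(x):
    """Index of the strongest non-DC Fourier component."""
    spec = np.abs(np.fft.rfft(x - x.mean()))
    return int(np.argmax(spec[1:]) + 1)

t = np.arange(1024)
signal = np.sin(2 * np.pi * 8 * t / len(t))       # input with 8 cycles

kernel = rng.normal(size=31)                      # a random "large" conv kernel
activation = np.tanh(np.convolve(signal, kernel, mode="same"))

# A linear-ish filter preserves the driving frequency; the non-linearity
# may add harmonics, but the fundamental usually still dominates.
print(dominant_freq(signal), dominant_freq(activation))
```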
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- A Data-Driven Hybrid Automaton Framework to Modeling Complex Dynamical Systems [2.610470075814367]
A data-driven hybrid automaton model is proposed to capture unknown complex dynamical system behaviors.
Small-scale neural networks are trained as the local dynamical description for their corresponding topologies.
A numerical example of the limit cycle is presented to illustrate that the developed models can significantly reduce the computational cost in reachable set computation.
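A bare-bones illustration of the hybrid-automaton structure described above: the state space is split into regions ("topologies"), each region gets its own small local model of the dynamics, and a guard triggers mode switches. In the paper the local models are small trained neural networks; here they are hand-written stand-ins, and the guard condition is invented for the example.

```python
import numpy as np

# Two modes of a toy system with a limit-cycle-like behavior.
def mode_grow(x):    # active inside the unit disk: outward spiral
    return np.array([x[0] - x[1], x[0] + x[1]]) * 0.05

def mode_rotate(x):  # active outside: rotation field (no radial component)
    return np.array([-x[1], x[0]]) * 0.05

def guard(x):
    """Select the active mode from the current state (the 'topology')."""
    return mode_grow if np.linalg.norm(x) < 1.0 else mode_rotate

x = np.array([0.1, 0.0])
for _ in range(200):                 # explicit Euler simulation of the automaton
    x = x + guard(x)(x)

print("final state:", np.round(x, 3), "norm:", round(float(np.linalg.norm(x)), 3))
```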
arXiv Detail & Related papers (2023-04-26T20:18:12Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
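The workflow in this summary is: fit a differentiable surrogate to simulator output once, then recover physical parameters by gradient descent through the surrogate on measured data. To stay dependency-free, the sketch below uses a closed-form surrogate and a numerical derivative where a real pipeline would use a neural network and automatic differentiation; every name and functional form is an assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulator(J, q):
    """Stand-in for data generated by a model Hamiltonian with coupling J."""
    return J * np.sin(q) ** 2

q = np.linspace(0, np.pi, 50)
J_true = 1.7
data = simulator(J_true, q) + 0.02 * rng.normal(size=q.size)

# Pretend this is the trained surrogate (in the paper: a network trained
# once on simulated spectra, then reused in real time).
surrogate = simulator

def loss(J):
    return np.mean((surrogate(J, q) - data) ** 2)

# Gradient descent with a numerical derivative standing in for autodiff.
J, lr, h = 0.5, 2.0, 1e-5
for _ in range(200):
    grad = (loss(J + h) - loss(J - h)) / (2 * h)
    J -= lr * grad

print(f"recovered J = {J:.3f} (true {J_true})")
```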
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
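For readers unfamiliar with cross-frequency coupling, the standard picture is phase-amplitude coupling: the phase of a slow theta rhythm gates the amplitude of a fast gamma rhythm. The snippet below only generates and crudely quantifies such a coupled signal; it is a generic illustration of CFC, not the memory-capacity model of the paper.

```python
import numpy as np

fs, T = 1000, 2.0                        # sample rate (Hz), duration (s)
t = np.arange(int(fs * T)) / fs
f_theta, f_gamma = 6.0, 40.0             # canonical theta / gamma frequencies

theta_phase = 2 * np.pi * f_theta * t
# Gamma amplitude is gated by theta phase: strongest at the theta peak.
envelope = 0.5 * (1 + np.cos(theta_phase))
signal = envelope * np.sin(2 * np.pi * f_gamma * t)

# Crude coupling measure: mean gamma power at theta peaks vs. theta troughs.
peaks = np.cos(theta_phase) > 0.9
troughs = np.cos(theta_phase) < -0.9
print("gamma power at theta peaks  :", float(np.mean(signal[peaks] ** 2)))
print("gamma power at theta troughs:", float(np.mean(signal[troughs] ** 2)))
```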
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- Coupled Oscillatory Recurrent Neural Network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies [15.2292571922932]
We propose a novel architecture for recurrent neural networks.
Our proposed RNN is based on a time-discretization of a system of second-order ordinary differential equations.
Experiments show that the proposed RNN is comparable in performance to the state of the art on a variety of benchmarks.
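The mechanism here is concrete enough to sketch: coRNN discretizes the damped, driven oscillator system $y'' = \sigma(W y + \mathcal{W} y' + V u + b) - \gamma y - \epsilon y'$ with an implicit-explicit Euler scheme. A minimal single-cell version follows; the parameter values are illustrative and the trained weights are replaced by random matrices.

```python
import numpy as np

rng = np.random.default_rng(5)
n, m = 16, 4                         # hidden size, input size
dt, gamma, eps = 0.05, 1.0, 0.1      # step size, stiffness, damping (illustrative)

W  = rng.normal(size=(n, n)) / np.sqrt(n)
Wz = rng.normal(size=(n, n)) / np.sqrt(n)
V  = rng.normal(size=(n, m)) / np.sqrt(m)
b  = np.zeros(n)

def cornn_step(y, z, u):
    """IMEX step of y'' = tanh(W y + Wz y' + V u + b) - gamma*y - eps*y'."""
    z = (z + dt * (np.tanh(W @ y + Wz @ z + V @ u + b) - gamma * y)) / (1 + dt * eps)
    y = y + dt * z
    return y, z

y, z = np.zeros(n), np.zeros(n)
for u in rng.normal(size=(100, m)):   # a random input sequence
    y, z = cornn_step(y, z, u)

print("state norm stays bounded:", round(float(np.linalg.norm(y)), 3))
```

The implicit treatment of the damping term (the division by 1 + dt*eps) is what gives the discretization its stability for long sequences.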
arXiv Detail & Related papers (2020-10-02T12:35:04Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
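The summary's key device is recasting a linear operator (moment) equation as a min-max game solved by gradient descent-ascent. The toy below runs descent-ascent on an instrumental-variable-style moment condition with the simplest possible "networks" (scalars) for both players; it illustrates the game structure only, not the paper's estimator or its guarantees.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simple SEM with an endogenous regressor X and instrument Z:
#   X = Z + confounder,  Y = theta_true * X + confounder + noise.
n, theta_true = 5000, 2.0
Z = rng.normal(size=n)
conf = rng.normal(size=n)
X = Z + conf
Y = theta_true * X + conf + 0.1 * rng.normal(size=n)

# Min-max objective: min_theta max_a  E[a*Z*(Y - theta*X)] - 0.5*E[(a*Z)^2].
theta, a, lr = 0.0, 0.0, 0.05
for _ in range(2000):
    resid = Y - theta * X
    grad_a = np.mean(Z * resid) - a * np.mean(Z ** 2)   # ascent for the critic
    grad_theta = -a * np.mean(Z * X)                    # descent for the model
    a += lr * grad_a
    theta -= lr * grad_theta

print(f"estimated theta = {theta:.3f} (true {theta_true}; OLS would be biased)")
```

The critic player drives the moment E[Z*(Y - theta*X)] to zero, which is exactly the instrumental-variable condition, so plain descent-ascent recovers theta despite the confounding.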
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Lipschitz Recurrent Neural Networks [100.72827570987992]
We show that our Lipschitz recurrent unit is more robust to input and parameter perturbations than other continuous-time RNNs.
Our experiments demonstrate that the Lipschitz RNN can outperform existing recurrent units on a range of benchmark tasks.
arXiv Detail & Related papers (2020-06-22T08:44:52Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
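A minimal liquid time-constant style cell, following the general construction the summary describes: each unit is a linear first-order (leaky) ODE whose effective time constant is modulated by a gate computed from the inputs and state, integrated with a fused semi-implicit Euler step. The gating form and all parameter values are assumptions for illustration, not the published LTC equations verbatim.

```python
import numpy as np

rng = np.random.default_rng(7)
n, m = 8, 3                         # number of units, input size
dt, tau, A = 0.1, 1.0, 1.0          # step, base time constant, bias attractor

Wf = rng.normal(size=(n, m))        # gate weights (would be learned)
Uf = rng.normal(size=(n, n)) / np.sqrt(n)
bf = np.zeros(n)

def ltc_step(x, u):
    """Fused semi-implicit Euler step of
       dx/dt = -(1/tau + f) * x + f * A,  with gate f = sigmoid(...)."""
    f = 1.0 / (1.0 + np.exp(-(Wf @ u + Uf @ x + bf)))
    # Treating x implicitly keeps the state bounded for any gate value.
    return (x + dt * f * A) / (1 + dt * (1.0 / tau + f))

x = np.zeros(n)
for u in rng.normal(size=(200, m)):
    x = ltc_step(x, u)

# The state stays inside a bounded region regardless of input scale.
print("state range:", float(x.min()), float(x.max()))
```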
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.