Cyclic Neural Network
- URL: http://arxiv.org/abs/2402.03332v1
- Date: Thu, 11 Jan 2024 07:31:53 GMT
- Title: Cyclic Neural Network
- Authors: Liangwei Yang, Hengrui Zhang, Zihe Song, Jiawei Zhang, Weizhi Zhang,
Jing Ma, Philip S. Yu
- Abstract summary: We introduce the groundbreaking Cyclic Neural Networks (Cyclic NNs), which emulate the flexible and dynamic graph nature of biological neural systems, allowing neuron connections in any graph-like structure, including cycles.
We develop the Graph Over Multi-layer Perceptron, the first detailed model based on this new design paradigm.
- Score: 46.05071312173701
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper answers a fundamental question in artificial neural network (ANN) design: we do not need to build ANNs layer-by-layer sequentially to guarantee the Directed Acyclic Graph (DAG) property. Drawing inspiration from biological intelligence (BI), where neurons form a complex, graph-structured network, we introduce the groundbreaking Cyclic Neural Networks (Cyclic NNs). They emulate the flexible and dynamic graph nature of biological neural systems, allowing neuron connections in any graph-like structure, including cycles, which offers greater adaptability than the DAG structure of current ANNs. We further develop the Graph Over Multi-layer Perceptron, the first detailed model based on this new design paradigm. We experimentally validate the Cyclic NN's advantages on widely tested datasets in the most generalized cases, demonstrating its superiority over current backpropagation (BP) training methods through the use of a forward-forward (FF) training algorithm. This research illustrates an entirely new ANN design paradigm, a significant departure from current ANN designs that could lead to more biologically plausible AI systems.
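Below is a minimal, illustrative sketch of the core idea: neuron modules wired in a cycle, updated synchronously for a few steps, and trained with a local forward-forward-style goodness objective instead of global backpropagation. The module graph, widths, goodness threshold, and update rule are all assumptions for illustration, not the paper's Graph Over Multi-layer Perceptron.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16                                          # width of each module (assumption)
edges = {"a": ["b"], "b": ["c"], "c": ["a"]}    # a 3-cycle: a -> b -> c -> a
W = {m: rng.normal(0, 0.1, (D, D)) for m in edges}

def propagate(x, steps=3):
    """Synchronous updates let signals travel around the cycle;
    no global DAG ordering is ever required."""
    h = {m: np.zeros(D) for m in edges}
    for _ in range(steps):
        h = {m: np.maximum(0.0, W[m] @ (x + sum(h[s] for s in edges if m in edges[s])))
             for m in edges}
    return h

def ff_update(x, positive, lr=1e-2, theta=2.0):
    """Local FF-style step: push each module's 'goodness' (mean squared
    activation) above theta on positive data, below it on negative data."""
    h = propagate(x)
    for m in edges:
        g = np.mean(h[m] ** 2)
        if (positive and g > theta) or (not positive and g < theta):
            continue                            # this module is already fine
        # recompute inputs from the final state (a local approximation)
        inputs = x + sum(h[s] for s in edges if m in edges[s])
        grad = 2.0 * np.outer(h[m], inputs) / D  # gradient of g, inputs held fixed
        W[m] += lr * grad if positive else -lr * grad

x_pos, x_neg = rng.normal(size=D), rng.normal(size=D)
for _ in range(100):
    ff_update(x_pos, positive=True)             # raise goodness on real data
    ff_update(x_neg, positive=False)            # lower it on negative data
```

Because every update is local to a module, no global backward pass through the cycle is ever needed, which is what makes non-DAG wiring trainable at all.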
Related papers
- DFA-GNN: Forward Learning of Graph Neural Networks by Direct Feedback Alignment [57.62885438406724]
Graph neural networks are recognized for their strong performance across various applications.
Backpropagation (BP) has limitations that challenge its biological plausibility and affect the efficiency, scalability, and parallelism of training neural networks for graph-based tasks.
We propose DFA-GNN, a novel forward learning framework tailored for GNNs with a case study of semi-supervised learning.
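As a rough sketch of the underlying mechanism, the following shows generic direct feedback alignment on a small MLP: the output error is projected straight to the hidden layer through a fixed random matrix, so no symmetric backward weights are needed. The MLP setting and shapes are our own assumptions; DFA-GNN adapts this idea to GNN message passing.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 8, 16, 3
W1 = rng.normal(0, 0.1, (n_hid, n_in))
W2 = rng.normal(0, 0.1, (n_out, n_hid))
B1 = rng.normal(0, 0.1, (n_hid, n_out))   # fixed random feedback, never trained

def step(x, y, lr=0.1):
    global W1, W2
    z1 = W1 @ x
    h1 = np.tanh(z1)
    y_hat = W2 @ h1                        # linear output layer
    e = y_hat - y                          # output error
    d1 = (B1 @ e) * (1.0 - h1 ** 2)        # error sent directly via B1; tanh'
    W2 = W2 - lr * np.outer(e, h1)         # local updates: no transport of W2.T
    W1 = W1 - lr * np.outer(d1, x)
    return 0.5 * float(e @ e)

x, y = rng.normal(size=n_in), np.eye(n_out)[0]
for _ in range(200):
    loss = step(x, y)
print(round(loss, 4))
```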
arXiv Detail & Related papers (2024-06-04T07:24:51Z)
- Web Neural Network with Complete DiGraphs [8.2727500676707]
Current neural networks have structures that vaguely mimic brain structure, such as neurons, convolutions, and recurrence.
The model proposed in this paper adds additional structural properties by introducing cycles into the neuron connections and removing the sequential nature commonly seen in other network layers.
Furthermore, the model has continuous input and output, inspired by spiking neural networks, which allows the network to learn a process of classification, rather than simply returning the final result.
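A toy illustration of that design: every neuron connects to every other (a complete digraph, cycles included), the state evolves over continuous rounds, and an output is read at every step rather than only at the end. The leaky-integration dynamics and sizes are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 32                                   # number of neurons (assumption)
A = rng.normal(0, 1 / np.sqrt(N), (N, N))  # dense recurrent weights
np.fill_diagonal(A, 0.0)                 # complete digraph, no self-loops

def run(x_stream, leak=0.3):
    """Feed a continuous input stream; read the state at every step,
    so the network expresses a *process* rather than a single output."""
    h = np.zeros(N)
    outputs = []
    for x_t in x_stream:                 # x_t: length-N input drive
        h = (1 - leak) * h + leak * np.tanh(A @ h + x_t)
        outputs.append(h.copy())         # continuous output at each step
    return np.stack(outputs)

stream = rng.normal(size=(10, N))
states = run(stream)
print(states.shape)                      # (10, 32): one output per input step
```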
arXiv Detail & Related papers (2024-01-07T05:12:10Z)
- Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy [103.74640329539389]
We propose a deep feature selection method that simultaneously conducts feature selection and differentiable $k$-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability issue of learning $k$-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets.
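The non-differentiability trick can be sketched as a soft top-k: selecting the k nearest neighbours is relaxed into an entropic optimal-transport problem solved with Sinkhorn iterations. The anchors at 0/1, epsilon, and iteration count below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def soft_topk(scores, k, eps=0.1, iters=200):
    """Transport n unit-mass points to two bins ('not selected' at 0,
    'selected' at 1); the mass sent to bin 1 is a soft top-k indicator."""
    n = len(scores)
    s = (scores - scores.min()) / (scores.max() - scores.min() + 1e-9)
    C = np.stack([s ** 2, (s - 1.0) ** 2], axis=1)   # n x 2 cost to anchors 0, 1
    K = np.exp(-C / eps)
    a = np.full(n, 1.0 / n)                  # source masses
    b = np.array([(n - k) / n, k / n])       # target masses fix exactly k picks
    v = np.ones(2)
    for _ in range(iters):                   # Sinkhorn fixed-point iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]          # transport plan
    return n * P[:, 1]                       # soft membership in the top-k

# negative distances as similarity scores: the 3 closest points get weight ~1
d = np.array([0.1, 2.0, 0.3, 5.0, 0.2])
print(np.round(soft_topk(-d, k=3), 2))
```

Every step is differentiable, so gradients can flow through the graph construction into the feature-selection weights.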
arXiv Detail & Related papers (2023-05-21T08:15:55Z)
- Spiking Graph Convolutional Networks [19.36064180392385]
SpikingGCN is an end-to-end framework that aims to integrate the embedding of graph convolutional networks (GCNs) with the biofidelity characteristics of spiking neural networks (SNNs).
We show that SpikingGCN on a neuromorphic chip can bring a clear advantage of energy efficiency into graph data analysis.
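A rough sketch of that integration: smooth node features with normalized-adjacency propagation (a GCN-style encoder), then convert the embeddings into spike trains via probabilistic rate coding. The toy graph and the Bernoulli encoder are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]], float)  # toy graph
X = rng.random((4, 5))                                          # node features

A_hat = A + np.eye(4)                         # add self-loops
d = A_hat.sum(1)
P = A_hat / np.sqrt(np.outer(d, d))           # symmetric normalisation
H = P @ P @ X                                 # 2-step feature propagation

def rate_code(h, T=200):
    """Bernoulli spike encoder: each feature fires with probability
    proportional to its (clipped) magnitude at every time step."""
    p = np.clip(h, 0.0, 1.0)
    spikes = rng.random((T,) + h.shape) < p    # T x nodes x features
    return spikes.mean(0)                      # firing rates ~= embeddings

print(np.round(rate_code(H) - H, 2))           # rates approximate H
```

The energy claim comes from the spike domain: downstream computation reduces to accumulating sparse binary events, which neuromorphic chips execute cheaply.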
arXiv Detail & Related papers (2022-05-05T16:44:36Z)
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNNs) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
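For intuition, simplicial networks operate on edges and triangles as well as nodes; a standard ingredient is the Hodge 1-Laplacian built from boundary matrices. The tiny complex below (one filled triangle plus a dangling edge) is our own example, not taken from the paper.

```python
import numpy as np

# nodes 0..3; oriented edges: (0,1), (1,2), (0,2), (2,3)
B1 = np.array([   # node-to-edge boundary: -1 at the tail, +1 at the head
    [-1,  0, -1,  0],
    [ 1, -1,  0,  0],
    [ 0,  1,  1, -1],
    [ 0,  0,  0,  1],
], float)
# one triangle (0,1,2) with boundary e01 + e12 - e02
B2 = np.array([[1], [1], [-1], [0]], float)

L1 = B1.T @ B1 + B2 @ B2.T    # Hodge 1-Laplacian: operates on edge signals
print(L1.astype(int))
```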
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- Characterizing Learning Dynamics of Deep Neural Networks via Complex Networks [1.0869257688521987]
Complex Network Theory (CNT) represents Deep Neural Networks (DNNs) as directed weighted graphs to study them as dynamical systems.
We introduce metrics for nodes/neurons and layers, namely Nodes Strength and Layers Fluctuation.
Our framework distills trends in the learning dynamics and separates low- from high-accuracy networks.
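A small sketch of such metrics: treat a trained MLP as a weighted digraph and compute each neuron's strength as the sum of its incoming and outgoing weights, with the layer-level spread as a fluctuation-style summary. The exact definitions in the paper may differ; this follows the plain graph-theoretic ones.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = [rng.normal(size=(20, 10)), rng.normal(size=(5, 20))]  # 10-20-5 MLP

def node_strength(weights, layer):
    """Strength of the neurons in hidden `layer`: incoming weights from
    weights[layer] plus outgoing weights into weights[layer + 1]."""
    w_in = weights[layer]        # shape (n_layer, n_prev)
    w_out = weights[layer + 1]   # shape (n_next, n_layer)
    return w_in.sum(axis=1) + w_out.sum(axis=0)

s = node_strength(weights, 0)    # strengths of the 20 hidden neurons
print(round(s.mean(), 3), round(s.std(), 3))  # layer mean and its fluctuation
```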
arXiv Detail & Related papers (2021-10-06T10:03:32Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Directed Acyclic Graph Neural Networks [9.420935957200518]
We focus on a special, yet widely used, type of graphs -- DAGs -- and inject a stronger inductive bias -- partial ordering -- into the neural network design.
We propose the directed acyclic graph neural network, DAGNN, an architecture that processes information according to the flow defined by the partial order.
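The flow-respecting computation can be sketched as follows: visit nodes in topological order and let each node aggregate its predecessors' states before updating its own. The mean aggregator and tanh update are our own simplifications of the model's aggregation and update operators.

```python
import numpy as np

rng = np.random.default_rng(0)
preds = {0: [], 1: [0], 2: [0], 3: [1, 2]}     # DAG: 0 -> {1,2} -> 3
X = rng.random((4, 8))                         # node features
W = rng.normal(0, 0.3, (8, 8))

def topo_order(preds):
    """Kahn-style topological sort over the predecessor lists."""
    remaining = {v: set(p) for v, p in preds.items()}
    order = []
    while remaining:
        ready = [v for v, p in remaining.items() if not p]
        order += ready
        for v in ready:
            del remaining[v]
        for p in remaining.values():
            p -= set(ready)
    return order

h = np.zeros_like(X)
for v in topo_order(preds):
    agg = np.mean([h[u] for u in preds[v]], axis=0) if preds[v] else 0.0
    h[v] = np.tanh(W @ (X[v] + agg))           # update a node after its parents

print(h[3][:4])   # the sink node has seen information from every ancestor
```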
arXiv Detail & Related papers (2021-01-20T04:50:16Z)
- Evolutionary Architecture Search for Graph Neural Networks [23.691915813153496]
We propose a novel AutoML framework through the evolution of individual models in a large Graph Neural Network (GNN) architecture space.
To the best of our knowledge, this is the first work to introduce and evaluate evolutionary architecture search for GNN models.
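A toy version of such a search loop, with a made-up genome and a stand-in fitness function; real use would train and validate each candidate GNN instead.

```python
import random

random.seed(0)
SPACE = {"layers": [1, 2, 3], "hidden": [16, 32, 64],
         "agg": ["mean", "max", "sum"], "act": ["relu", "tanh"]}

def fitness(g):
    # stand-in for validation accuracy after training the candidate
    return g["layers"] * 0.1 + g["hidden"] / 100 + (g["agg"] == "sum") * 0.2

def mutate(g):
    child = dict(g)
    key = random.choice(list(SPACE))         # flip one gene at random
    child[key] = random.choice(SPACE[key])
    return child

pop = [{k: random.choice(v) for k, v in SPACE.items()} for _ in range(10)]
for _ in range(50):                          # evolve: sample, mutate the best
    sample = random.sample(pop, 3)
    parent = max(sample, key=fitness)
    pop.append(mutate(parent))
    pop.pop(0)                               # age out the oldest candidate

print(max(pop, key=fitness))
```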
arXiv Detail & Related papers (2020-09-21T22:11:53Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
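A minimal sketch of binarized message passing: node features and weights are quantized to {-1, +1} with sign(), so propagation could in principle use XNOR/popcount arithmetic. The toy graph and shapes are assumptions; BGN's actual layers and training scheme will differ.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)  # toy adjacency
X = rng.normal(size=(3, 4))                             # real node features
W = rng.normal(size=(4, 4))                             # real master weights

def binarize(t):
    b = np.sign(t)
    b[b == 0] = 1.0          # map sign(0) to +1 so values stay in {-1, +1}
    return b

# forward pass uses only binary tensors (apart from the float adjacency)
Xb, Wb = binarize(X), binarize(W)
H = binarize(A @ Xb @ Wb)    # binary messages in, binary embeddings out
print(H)
# training would keep the real-valued W and pass gradients through sign()
# as if it were the identity (straight-through estimator)
```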
arXiv Detail & Related papers (2020-04-19T09:43:14Z)