Highly connected dynamic artificial neural networks
- URL: http://arxiv.org/abs/2302.08928v1
- Date: Fri, 17 Feb 2023 15:05:29 GMT
- Title: Highly connected dynamic artificial neural networks
- Authors: Clint van Alten
- Abstract summary: An object-oriented approach to implementing artificial neural networks is introduced.
Networks are highly connected in that they admit edges between nodes in any layers of the network.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: An object-oriented approach to implementing artificial neural networks is
introduced in this article. The networks obtained in this way are highly
connected in that they admit edges between nodes in any layers of the network,
and dynamic, in that the insertion, or deletion, of nodes, edges or layers of
nodes can be effected in a straightforward way. In addition, the activation
functions of nodes need not be uniform within layers, and can also be changed
within individual nodes. Methods for implementing the feedforward step and the
backpropagation technique in such networks are presented here. Methods for
creating networks, for implementing the various dynamic properties and for
saving and recreating networks are also described.
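To make the abstract's description concrete, here is a minimal sketch (not the paper's actual code; all class and method names are illustrative assumptions) of a network in which each node keeps its own activation function and incoming edges, so edges may connect nodes in any layers and nodes can be inserted anywhere in the evaluation order. Feedforward simply visits nodes in their stored (topological) order.

```python
import math

class Node:
    """A node with its own activation function and list of incoming edges."""
    def __init__(self, activation=math.tanh):
        self.activation = activation
        self.in_edges = []   # list of (source Node, weight) pairs
        self.value = 0.0

class Network:
    """Nodes kept in topological order; edges may skip any number of layers."""
    def __init__(self):
        self.nodes = []

    def add_node(self, node, index=None):
        # Dynamic insertion: a node can be spliced in anywhere in the order.
        if index is None:
            self.nodes.append(node)
        else:
            self.nodes.insert(index, node)

    def add_edge(self, src, dst, weight):
        # Edges are allowed between nodes in any layers of the network.
        dst.in_edges.append((src, weight))

    def feedforward(self, inputs):
        # inputs: {node: value} for the input nodes; all others are computed
        # from their incoming edges in topological order.
        for node in self.nodes:
            if node in inputs:
                node.value = inputs[node]
            else:
                total = sum(w * src.value for src, w in node.in_edges)
                node.value = node.activation(total)
        return [n.value for n in self.nodes]
```

Because each node stores its own activation, changing a single node's activation or splicing in a new node requires no change to the rest of the network.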
Related papers
- Unifying Structural Proximity and Equivalence for Enhanced Dynamic Network Embedding [1.6044444452278062]
This paper proposes a novel unifying dynamic network embedding method that simultaneously preserves both structural proximity and equivalence.
We then introduce a temporal-structural random walk to flexibly sample time-respecting sequences of nodes, considering both their temporal proximity and similarity in evolving structures.
The proposed method is evaluated on node classification across five real-world networks, where it outperforms benchmark methods.
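The temporal-structural random walk can be illustrated with a toy time-respecting walk. This sketch covers only the temporal-proximity half of the idea and replaces the paper's structural-similarity weighting with a uniform choice among valid continuations; all names are hypothetical.

```python
import random

def temporal_walk(edges, start, length, seed=0):
    """Sample a time-respecting node sequence from a temporal edge list.

    edges: list of (u, v, t) triples; the walk may only traverse an edge
    whose timestamp is >= the timestamp of the previously used edge.
    """
    rng = random.Random(seed)
    walk = [start]
    current, t_min = start, float("-inf")
    for _ in range(length - 1):
        candidates = [(v, t) for u, v, t in edges if u == current and t >= t_min]
        if not candidates:
            break  # no time-respecting continuation exists
        v, t = rng.choice(candidates)
        walk.append(v)
        current, t_min = v, t
    return walk
```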
arXiv Detail & Related papers (2025-03-14T08:40:05Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Conditional computation in neural networks: principles and research trends [48.14569369912931]
This article summarizes principles and ideas from the emerging area of applying conditional computation methods to the design of neural networks.
In particular, we focus on neural networks that can dynamically activate or de-activate parts of their computational graph conditionally on their input.
arXiv Detail & Related papers (2024-03-12T11:56:38Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
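The degree-based stratification idea can be sketched in miniature: nodes are bucketed by degree, and each bucket gets its own weight (a scalar stand-in for the per-group weight matrices the paper learns). This is an illustrative toy, not the authors' implementation.

```python
def stratified_gnn_layer(features, adj, weights_by_group, boundaries=(2,)):
    """One message-passing step where each degree group has its own weight.

    features: {node: float}, adj: {node: [neighbors]},
    weights_by_group: one scalar weight per degree group.
    boundaries: degree thresholds that separate the groups.
    """
    def group(deg):
        # Index of the degree stratum this node falls into.
        g = 0
        for b in boundaries:
            if deg >= b:
                g += 1
        return g

    out = {}
    for node, nbrs in adj.items():
        w = weights_by_group[group(len(nbrs))]
        agg = sum(features[n] for n in nbrs)  # simple sum aggregation
        out[node] = w * agg
    return out
```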
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks--CONN, a tailored GNN architecture for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- Associative Learning for Network Embedding [20.873120242498292]
We introduce a network embedding method from a new perspective.
Our network learns associations between the content of each node and that node's neighbors.
Our proposed method is evaluated on different downstream tasks such as node classification and linkage prediction.
arXiv Detail & Related papers (2022-08-30T16:35:45Z)
- DynACPD Embedding Algorithm for Prediction Tasks in Dynamic Networks [6.5361928329696335]
We present novel embedding methods for a dynamic network based on higher order tensor decompositions for tensorial representations of the dynamic network.
We demonstrate the power and efficiency of our approach by comparing our algorithms' performance on the link prediction task against an array of current baseline methods.
arXiv Detail & Related papers (2021-03-12T04:36:42Z)
- Artificial Neural Networks generated by Low Discrepancy Sequences [59.51653996175648]
We generate artificial neural networks as random walks on a dense network graph.
Such networks can be trained sparse from scratch, avoiding the expensive procedure of training a dense network and compressing it afterwards.
We demonstrate that the artificial neural networks generated by low discrepancy sequences can achieve an accuracy within reach of their dense counterparts at a much lower computational complexity.
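As a hedged illustration of driving sparse connectivity with a low discrepancy sequence (a toy stand-in, not the paper's random-walk construction): an additive Kronecker sequence with irrational steps spreads edge endpoints evenly over the source-by-target grid, unlike independent pseudo-random draws.

```python
def low_discrepancy_edges(n_src, n_dst, n_edges):
    """Pick sparse connections from a low discrepancy (Kronecker) sequence."""
    phi = (5 ** 0.5 - 1) / 2   # fractional golden ratio: step for source coord
    rt2 = 2 ** 0.5 - 1         # fractional sqrt(2): step for target coord
    edges, x, y = [], 0.0, 0.0
    for _ in range(n_edges):
        x = (x + phi) % 1.0    # additive recurrence with irrational step
        y = (y + rt2) % 1.0
        edges.append((int(x * n_src), int(y * n_dst)))
    return edges
```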
arXiv Detail & Related papers (2021-03-05T08:45:43Z)
- Lattice Fusion Networks for Image Denoising [4.010371060637209]
A novel method for feature fusion in convolutional neural networks is proposed in this paper.
Both existing feature fusion techniques and the proposed network can be considered a type of Directed Acyclic Graph (DAG) network.
The proposed network is able to achieve better results with far fewer learnable parameters.
arXiv Detail & Related papers (2020-11-28T18:57:54Z)
- Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of using the same path of the network, DG-Net aggregates features dynamically in each node, which allows the network to have more representation ability.
arXiv Detail & Related papers (2020-10-02T16:50:26Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed as DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
- Adversarial Deep Network Embedding for Cross-network Node Classification [27.777464531860325]
Cross-network node classification leverages the abundant labeled nodes from a source network to help classify unlabeled nodes in a target network.
In this paper, we propose an adversarial cross-network deep network embedding model to integrate adversarial domain adaptation with deep network embedding.
The proposed ACDNE model achieves the state-of-the-art performance in cross-network node classification.
arXiv Detail & Related papers (2020-02-18T04:30:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.