Multi-Graph Tensor Networks
- URL: http://arxiv.org/abs/2010.13209v4
- Date: Thu, 21 Jan 2021 10:04:25 GMT
- Title: Multi-Graph Tensor Networks
- Authors: Yao Lei Xu, Kriton Konstantinidis, Danilo P. Mandic
- Abstract summary: We introduce a novel Multi-Graph Tensor Network (MGTN) framework, which exploits the ability of graphs to handle irregular data sources and the compression properties of tensor networks in a deep learning setting.
By virtue of the MGTN, a FOREX currency graph is leveraged to impose an economically meaningful structure on this demanding task, yielding superior performance over three competing models at a drastically lower complexity.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The irregular and multi-modal nature of numerous modern data sources poses
serious challenges for traditional deep learning algorithms. To this end,
recent efforts have generalized existing algorithms to irregular domains
through graphs, with the aim to gain additional insights from data through the
underlying graph topology. At the same time, tensor-based methods have
demonstrated promising results in bypassing the bottlenecks imposed by the
Curse of Dimensionality. In this paper, we introduce a novel Multi-Graph Tensor
Network (MGTN) framework, which exploits both the ability of graphs to handle
irregular data sources and the compression properties of tensor networks in a
deep learning setting. The potential of the proposed framework is demonstrated
through an MGTN based deep Q agent for Foreign Exchange (FOREX) algorithmic
trading. By virtue of the MGTN, a FOREX currency graph is leveraged to impose
an economically meaningful structure on this demanding task, yielding superior
performance over three competing models at a drastically lower complexity.
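The abstract's two ingredients can be illustrated with a minimal numerical sketch: per-graph filtering of a multi-modal input tensor, followed by a tensor-train (TT) compressed weight instead of a dense layer. This is a hypothetical reconstruction, not the paper's actual architecture; all shapes, graph names, and TT ranks below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two assumed "modes": a currency graph (N1 nodes) and a time-lag
# graph (N2 nodes), each carrying F features per node.
N1, N2, F = 4, 3, 2
A1 = rng.random((N1, N1))      # adjacency of the currency graph (assumed)
A2 = rng.random((N2, N2))      # adjacency of the lag graph (assumed)
X = rng.random((N1, N2, F))    # multi-modal input tensor

# (i) Multi-graph filtering: propagate features along each graph's mode.
H = np.einsum('ij,jkf->ikf', A1, X)   # filter over graph 1
H = np.einsum('kl,ilf->ikf', A2, H)   # filter over graph 2

# (ii) TT-compressed weight: rather than one dense (N1*N2*F x D)
# matrix, contract three small cores chained by ranks r1, r2.
r1, r2, D = 2, 2, 5
G1 = rng.random((N1, r1))
G2 = rng.random((r1, N2, r2))
G3 = rng.random((r2, F, D))

# Contract the filtered tensor with the TT cores, then apply a
# nonlinearity, as in a deep layer.
y = np.tanh(np.einsum('ikf,ia,akb,bfd->d', H, G1, G2, G3))

# The compression the abstract refers to: the TT cores hold far
# fewer parameters than the equivalent dense weight matrix.
dense_params = N1 * N2 * F * D
tt_params = G1.size + G2.size + G3.size
print(y.shape, dense_params, tt_params)
```

Even at this toy scale, the TT cores use 40 parameters versus 120 for the dense weight, and the gap widens rapidly with dimension, which is the "drastically lower complexity" the abstract claims.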
Related papers
- DFA-GNN: Forward Learning of Graph Neural Networks by Direct Feedback Alignment [57.62885438406724]
Graph neural networks are recognized for their strong performance across various applications.
BP has limitations that challenge its biological plausibility and affect the efficiency, scalability and parallelism of training neural networks for graph-based tasks.
We propose DFA-GNN, a novel forward learning framework tailored for GNNs with a case study of semi-supervised learning.
arXiv Detail & Related papers (2024-06-04T07:24:51Z) - Tensor-view Topological Graph Neural Network [16.433092191206534]
Graph neural networks (GNNs) have recently gained growing attention in graph learning.
Existing GNNs only use local information from a very limited neighborhood around each node.
We propose a novel Tensor-view Topological Graph Neural Network (TTG-NN), a class of simple yet effective deep learning architectures.
Real data experiments show that the proposed TTG-NN outperforms 20 state-of-the-art methods on various graph benchmarks.
arXiv Detail & Related papers (2024-01-22T14:55:01Z) - Network Alignment with Transferable Graph Autoencoders [79.89704126746204]
We propose a novel graph autoencoder architecture designed to extract powerful and robust node embeddings.
We prove that the generated embeddings are associated with the eigenvalues and eigenvectors of the graphs.
Our proposed framework also leverages transfer learning and data augmentation to achieve efficient network alignment at a very large scale without retraining.
arXiv Detail & Related papers (2023-10-05T02:58:29Z) - GRANDE: a neural model over directed multigraphs with application to
anti-money laundering [20.113306761523713]
We develop a novel GNN protocol that overcomes these challenges by efficiently incorporating directional information.
We propose an enhancement that targets edge-related tasks using a novel message passing scheme over an extension of the edge-to-node dual graph.
A concrete GNN architecture called GRANDE is derived using the proposed protocol.
arXiv Detail & Related papers (2023-02-04T05:54:25Z) - Mastering Spatial Graph Prediction of Road Networks [18.321172168775472]
We propose a graph-based framework that simulates the addition of sequences of graph edges.
In particular, given a partially generated graph associated with a satellite image, an RL agent nominates modifications that maximize a cumulative reward.
arXiv Detail & Related papers (2022-10-03T11:26:09Z) - Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural
Networks [52.566735716983956]
We propose a graph gradual pruning framework termed CGP to dynamically prune GNNs.
Unlike LTH-based methods, the proposed CGP approach requires no re-training, which significantly reduces the computation costs.
Our proposed strategy greatly improves both training and inference efficiency while matching or even exceeding the accuracy of existing methods.
arXiv Detail & Related papers (2022-07-18T14:23:31Z) - Tensor Networks for Multi-Modal Non-Euclidean Data [24.50116388903113]
We introduce a novel Multi-Graph Tensor Network (MGTN) framework, which leverages the desirable properties of graphs, tensors and neural networks in a physically meaningful and compact manner.
This equips MGTNs with the ability to exploit local information in irregular data sources at a drastically reduced parameter complexity.
The benefits of the MGTN framework, especially its ability to avoid overfitting through the inherent low-rank regularization properties of tensor networks, are demonstrated.
arXiv Detail & Related papers (2021-03-27T21:33:46Z) - Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph
Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust
Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs, which are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
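The "collection of graphs represented by a tensor" idea above can be sketched in a few lines: stack R relation-specific adjacency matrices into an (R, N, N) tensor and mix per-relation propagations. This is an illustrative simplification under assumed shapes, not the paper's exact TGCN layer.

```python
import numpy as np

rng = np.random.default_rng(1)

R, N, F, D = 3, 5, 4, 2            # relations, nodes, in/out features
A = rng.random((R, N, N))          # adjacency tensor: R graphs over N nodes
X = rng.random((N, F))             # node feature matrix
W = rng.random((F, D))             # shared feature transform (assumed)
c = rng.random(R)                  # relation-mixing coefficients (assumed learnable)

# One layer: propagate over each relation graph, combine with the
# mixing weights, then transform features: H = sum_r c_r * A_r @ X @ W.
H = np.einsum('r,rij,jf,fd->id', c, A, X, W)
H = np.maximum(H, 0.0)             # ReLU nonlinearity
print(H.shape)
```

The single `einsum` makes the tensor view explicit: the relation axis `r` is contracted like any other mode, so adding relations adds slices to `A` rather than separate per-graph models.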
arXiv Detail & Related papers (2020-03-15T02:33:21Z) - Graph Representation Learning via Graphical Mutual Information
Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.