Graph Tensor Networks: An Intuitive Framework for Designing Large-Scale
Neural Learning Systems on Multiple Domains
- URL: http://arxiv.org/abs/2303.13565v1
- Date: Thu, 23 Mar 2023 13:05:35 GMT
- Title: Graph Tensor Networks: An Intuitive Framework for Designing Large-Scale
Neural Learning Systems on Multiple Domains
- Authors: Yao Lei Xu, Kriton Konstantinidis, Danilo P. Mandic
- Abstract summary: We introduce the Graph Tensor Network (GTN) framework for designing and implementing large-scale neural learning systems.
The proposed framework is shown to be general enough to include many popular architectures as special cases, and flexible enough to handle data residing on any single domain or across multiple domains.
- Score: 23.030263841031633
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite the omnipresence of tensors and tensor operations in modern deep
learning, the use of tensor mathematics to formally design and describe neural
networks is still under-explored within the deep learning community. To this
end, we introduce the Graph Tensor Network (GTN) framework, an intuitive yet
rigorous graphical framework for systematically designing and implementing
large-scale neural learning systems on both regular and irregular domains. The
proposed framework is shown to be general enough to include many popular
architectures as special cases, and flexible enough to handle data residing on
any single domain or across multiple domains. The power and flexibility of the
proposed framework are demonstrated through real-data experiments, resulting in
improved performance at drastically lower complexity costs, by virtue of tensor
algebra.
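To make the construction concrete, below is a minimal sketch (not the authors' reference implementation) of a GTN-style layer: a graph filter propagates node features, and the dense weight matrix is replaced by tensor-train (TT) cores contracted directly with the data, so the full matrix is never materialized. All shapes, ranks, and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def tt_matvec(cores, x, in_modes, out_modes):
    """Apply a weight matrix stored in tensor-train (TT) format to a batch
    of row vectors, without ever materializing the full matrix.
    cores[k] has shape (rank[k], in_modes[k], out_modes[k], rank[k+1])."""
    res = x.reshape(x.shape[0], *in_modes)[..., None]  # (batch, m0, m1, m2, rank=1)
    for G in cores:
        # contract the leading input mode and the trailing TT rank with one core
        res = np.einsum('bm...r,rmns->b...ns', res, G)
    return res.reshape(x.shape[0], int(np.prod(out_modes)))

# Tensorized feature dimensions: 64 = 4*4*4 in and out (illustrative choice)
in_modes, out_modes, ranks = (4, 4, 4), (4, 4, 4), (1, 3, 3, 1)
cores = [rng.normal(scale=0.3,
                    size=(ranks[k], in_modes[k], out_modes[k], ranks[k + 1]))
         for k in range(3)]

# Toy graph filter: row-normalized adjacency with self-loops
N = 10
A = (rng.random((N, N)) < 0.3).astype(float)
A_hat = A + np.eye(N)
A_hat /= A_hat.sum(axis=1, keepdims=True)

X = rng.normal(size=(N, 64))  # node features
H = np.maximum(tt_matvec(cores, A_hat @ X, in_modes, out_modes), 0.0)  # one layer
print(H.shape)                                  # (10, 64)
print(sum(G.size for G in cores), 'TT parameters vs', 64 * 64, 'dense')  # 240 vs 4096
```

The parameter count is where the tensor algebra pays off: the three TT cores hold 240 numbers in place of the 4096 a dense layer of the same size would need.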
Related papers
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
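For context on the LP reduction mentioned in this entry: the classical idea is that a discretized linear ODE can be recovered by a linear program that minimizes slack on the finite-difference residuals. The sketch below implements that textbook reduction with scipy.optimize.linprog; it is not the paper's NeuRLP solver, and the grid and variable layout are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Solve du/dt = a*u, u(0) = 1 on [0, 1] as a linear program: minimize the
# total slack s_i subject to |forward-difference residual_i| <= s_i.
a, n = -2.0, 51
t = np.linspace(0.0, 1.0, n)
h = t[1] - t[0]

# Decision variables: x = [u_0, ..., u_{n-1}, s_0, ..., s_{n-2}]
n_u, n_s = n, n - 1
c = np.concatenate([np.zeros(n_u), np.ones(n_s)])  # objective: total slack

# Residual r_i = (u_{i+1} - u_i)/h - a*u_i; encode r_i - s_i <= 0 and -r_i - s_i <= 0
A_ub = np.zeros((2 * n_s, n_u + n_s))
for i in range(n_s):
    A_ub[2 * i, i] = -1.0 / h - a
    A_ub[2 * i, i + 1] = 1.0 / h
    A_ub[2 * i, n_u + i] = -1.0
    A_ub[2 * i + 1, i] = 1.0 / h + a
    A_ub[2 * i + 1, i + 1] = -1.0 / h
    A_ub[2 * i + 1, n_u + i] = -1.0
b_ub = np.zeros(2 * n_s)

# Initial condition u_0 = 1 as an equality constraint
A_eq = np.zeros((1, n_u + n_s))
A_eq[0, 0] = 1.0

bounds = [(None, None)] * n_u + [(0, None)] * n_s  # u free, slacks nonnegative
sol = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
u = sol.x[:n_u]
print('max error vs exp(a*t):', np.abs(u - np.exp(a * t)).max())  # O(h) accurate
```

At the optimum all slacks are zero, so the LP recovers exactly the forward-Euler trajectory; the paper's contribution is a relaxed, learnable version of this kind of solver.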
- LasTGL: An Industrial Framework for Large-Scale Temporal Graph Learning [61.4707298969173]
We introduce LasTGL, an industrial framework that integrates unified and extensible implementations of common temporal graph learning algorithms.
LasTGL provides comprehensive temporal graph datasets, TGNN models and utilities along with well-documented tutorials.
arXiv Detail & Related papers (2023-11-28T08:45:37Z)
- MAgNET: A Graph U-Net Architecture for Mesh-Based Simulations [0.5185522256407782]
We present MAgNET, which extends well-known convolutional neural networks to accommodate arbitrary graph-structured data.
We demonstrate the predictive capabilities of MAgNET in surrogate modeling for non-linear finite element simulations in the mechanics of solids.
arXiv Detail & Related papers (2022-11-01T19:23:45Z)
- Convolutional Learning on Multigraphs [153.20329791008095]
We develop convolutional information processing on multigraphs and introduce convolutional multigraph neural networks (MGNNs).
To capture the complex dynamics of information diffusion within and across each of the multigraph's classes of edges, we formalize a convolutional signal processing model.
We develop a multigraph learning architecture, including a sampling procedure to reduce computational complexity.
The introduced architecture is applied towards optimal wireless resource allocation and a hate speech localization task, offering improved performance over traditional graph neural networks.
arXiv Detail & Related papers (2022-09-23T00:33:04Z)
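A minimal sketch of the convolutional model summarized in this entry, under the standard polynomial-filter formulation: one shift operator per edge class, a bank of filter taps per class, and the per-class outputs summed. The two-class toy graph and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def multigraph_filter(shifts, coeffs, x):
    """Convolution on a multigraph: sum over edge classes e of the
    polynomial filter sum_k coeffs[e][k] * S_e^k applied to the signal x."""
    y = np.zeros_like(x)
    for S, h in zip(shifts, coeffs):
        z = x.copy()                    # S_e^0 x
        for k, h_k in enumerate(h):
            if k > 0:
                z = S @ z               # raise to S_e^k x
            y = y + h_k * z
    return y

# Toy multigraph on 8 nodes with two edge classes (e.g. "follows", "mentions")
N, K = 8, 3
shifts = []
for _ in range(2):
    A = (rng.random((N, N)) < 0.3).astype(float)
    np.fill_diagonal(A, 0.0)
    shifts.append(A / max(A.sum(), 1.0))  # crude normalization keeps powers stable

coeffs = [rng.normal(size=K) for _ in shifts]  # K filter taps per edge class
x = rng.normal(size=(N, 4))                    # 4-channel node signal
y = multigraph_filter(shifts, coeffs, x)
print(y.shape)  # (8, 4): same nodes, filtered across both edge classes
```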
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
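The following is a compressed sketch of the sparse-GP mechanics behind the entry above, using the standard DTC/SoR predictive mean with inducing points placed directly in feature space. A fixed random feature map stands in for the learned one; in IGN both the map and the inducing points are optimized jointly, and all sizes here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(A, B, ell=1.0):
    """RBF kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

# A fixed random feature map standing in for the learned one (hypothetical)
W = rng.normal(size=(1, 8))
phi = lambda x: np.tanh(x @ W)  # maps 1-d inputs into an 8-d feature space

# Training data: a noisy sine wave
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=60)

# The key IGN idea: inducing points live directly in the feature space
Z = rng.normal(size=(10, 8))
F = phi(X)
sn2 = 0.01  # observation noise variance

Kzz = rbf(Z, Z) + 1e-8 * np.eye(len(Z))
Kzx = rbf(Z, F)
Sigma = Kzz + Kzx @ Kzx.T / sn2        # DTC/SoR posterior term over inducing outputs
alpha = np.linalg.solve(Sigma, Kzx @ y) / sn2

Xs = np.linspace(-3, 3, 5)[:, None]
mu = rbf(phi(Xs), Z) @ alpha           # sparse predictive mean at test inputs
print(np.c_[Xs[:, 0], mu])
```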
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning remains limited by the representational capacity of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
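Many of the hyperbolic GNNs this review covers follow a tangent-space aggregation pattern: map ball points to the tangent space at the origin, aggregate with ordinary Euclidean graph operators, and map back. A minimal sketch on the Poincare ball follows; curvature, graph, and shapes are illustrative assumptions.

```python
import numpy as np

def expmap0(v, c=1.0):
    """Exponential map at the origin of the Poincare ball of curvature -c:
    sends a Euclidean tangent vector onto the ball (norm < 1/sqrt(c))."""
    norm = np.linalg.norm(v, axis=-1, keepdims=True).clip(min=1e-9)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def logmap0(x, c=1.0):
    """Inverse map: takes a ball point back to the tangent space at the origin."""
    norm = np.linalg.norm(x, axis=-1, keepdims=True).clip(min=1e-9)
    scaled = np.clip(np.sqrt(c) * norm, 0.0, 1.0 - 1e-9)
    return np.arctanh(scaled) * x / (np.sqrt(c) * norm)

# Tangent-space aggregation on a 3-node toy graph
rng = np.random.default_rng(3)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
A_hat = A + np.eye(3)
A_hat /= A_hat.sum(axis=1, keepdims=True)

H = expmap0(0.1 * rng.normal(size=(3, 4)))  # node embeddings on the ball
H_next = expmap0(A_hat @ logmap0(H))        # log -> Euclidean aggregate -> exp
print(np.linalg.norm(H_next, axis=1))       # all norms < 1: still inside the ball
```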
- Learning through structure: towards deep neuromorphic knowledge graph embeddings [0.5906031288935515]
We propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures.
Based on the insight that randomly initialized and untrained graph neural networks are able to preserve local graph structures, we compose a frozen neural network with shallow knowledge graph embedding models.
We experimentally show that already on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level.
arXiv Detail & Related papers (2021-09-21T18:01:04Z)
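A toy sketch of the recipe in the entry above, under stated assumptions: a single round of random, untrained message passing produces frozen entity embeddings, and a shallow TransE-style decoder scores triples on top of them. The graph, dimensions, and scoring rule are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy knowledge graph: 5 entities, 2 relations, triples are (head, rel, tail)
triples = [(0, 0, 1), (1, 0, 2), (2, 1, 3), (3, 1, 4)]
n_ent, n_rel, d = 5, 2, 8

# Frozen encoder: one round of random, untrained message passing over the
# entity graph -- random GNNs already preserve local graph structure.
A = np.zeros((n_ent, n_ent))
for hd, _, tl in triples:
    A[hd, tl] = A[tl, hd] = 1.0
A += np.eye(n_ent)
A /= A.sum(axis=1, keepdims=True)

W = rng.normal(scale=0.5, size=(d, d))            # random weights, never trained
E = np.tanh(A @ rng.normal(size=(n_ent, d)) @ W)  # frozen entity embeddings

# Shallow decoder: TransE-style scoring; the relation vectors would be the
# only parameters actually trained in this setup.
R = rng.normal(scale=0.1, size=(n_rel, d))

def score(h, r, t):
    """Higher (less negative) means a more plausible triple."""
    return -np.linalg.norm(E[h] + R[r] - E[t])

print(score(0, 0, 1), score(0, 0, 4))  # observed triple vs. an arbitrary pair
```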
- Tensor Networks for Multi-Modal Non-Euclidean Data [24.50116388903113]
We introduce a novel Multi-Graph Tensor Network (MGTN) framework, which leverages the desirable properties of graphs, tensors and neural networks in a physically meaningful and compact manner.
This equips MGTNs with the ability to exploit local information in irregular data sources at a drastically reduced parameter complexity.
The benefits of the MGTN framework, especially its ability to avoid overfitting through the inherent low-rank regularization properties of tensor networks, are demonstrated.
arXiv Detail & Related papers (2021-03-27T21:33:46Z)
- Temporal Graph Networks for Deep Learning on Dynamic Graphs [4.5158585619109495]
We present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events.
Thanks to a novel combination of memory modules and graph-based operators, TGNs significantly outperform previous approaches while also being more computationally efficient.
arXiv Detail & Related papers (2020-06-18T16:06:18Z)
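To illustrate the event-driven memory idea in the TGN entry above: each timed interaction updates the memory of the nodes involved. The sketch below uses a simplified linear-tanh update in place of TGN's GRU and attention modules; all names and shapes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
d = 8
memory = {}                                      # one memory vector per node
Wm = rng.normal(scale=0.3, size=(2 * d + 1, d))  # toy stand-in for TGN's GRU

def get_mem(node):
    return memory.setdefault(node, np.zeros(d))

def process_event(src, dst, t):
    """Handle one timed interaction (src -> dst at time t): build a message
    from both endpoint memories plus the timestamp, then update src's memory
    with a simplified recurrent (linear + tanh) rule."""
    msg = np.concatenate([get_mem(src), get_mem(dst), [t]])
    memory[src] = np.tanh(msg @ Wm)

# A dynamic graph represented as a sequence of timed events
events = [(0, 1, 0.1), (1, 2, 0.4), (0, 2, 0.9)]
for s, dst, t in events:
    process_event(s, dst, t)

# In a full TGN, these memories would be combined with graph-based operators
# (e.g. attention over temporal neighbors) to produce node embeddings.
print({node: np.round(vec[:3], 2) for node, vec in memory.items()})
```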
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.