Neural Topological Ordering for Computation Graphs
- URL: http://arxiv.org/abs/2207.05899v1
- Date: Wed, 13 Jul 2022 00:12:02 GMT
- Title: Neural Topological Ordering for Computation Graphs
- Authors: Mukul Gagrani, Corrado Rainone, Yang Yang, Harris Teague, Wonseok
Jeon, Herke Van Hoof, Weiliang Will Zeng, Piero Zappi, Christopher Lott,
Roberto Bondesan
- Abstract summary: We propose an end-to-end machine learning based approach for topological ordering using an encoder-decoder framework.
We show that our model outperforms, or is on par with, several topological ordering baselines while being significantly faster on synthetic graphs with up to 2k nodes.
- Score: 23.225391263047364
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent works on machine learning for combinatorial optimization have shown
that learning based approaches can outperform heuristic methods in terms of
speed and performance. In this paper, we consider the problem of finding an
optimal topological order on a directed acyclic graph, with a focus on the
memory minimization problem that arises in compilers. We propose an end-to-end
machine learning based approach for topological ordering using an
encoder-decoder framework. Our encoder is a novel attention based graph neural
network architecture called \emph{Topoformer} which uses different topological
transforms of a DAG for message passing. The node embeddings produced by the
encoder are converted into node priorities which are used by the decoder to
generate a probability distribution over topological orders. We train our model
on a dataset of synthetically generated graphs called layered graphs. We show
that our model outperforms, or is on par with, several topological ordering
baselines while being significantly faster on synthetic graphs with up to 2k
nodes. We also train and test our model on a set of real-world computation
graphs, showing performance improvements.
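The decoder described in the abstract is essentially a learned priority function plugged into a list-scheduling loop. Below is a minimal Python sketch of how encoder-produced node priorities could induce a probability distribution over topological orders; the graph, priority values, and function names are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np

def sample_topological_order(succ, n_nodes, priorities, rng=None):
    """Sample a topological order of a DAG.

    At each step the next node is drawn from the set of schedulable
    nodes (all predecessors already placed), with probabilities given
    by a softmax over the node priorities. succ[u] lists the
    successors of node u.
    """
    rng = rng or np.random.default_rng(0)
    indegree = [0] * n_nodes
    for u in range(n_nodes):
        for v in succ[u]:
            indegree[v] += 1
    ready = [u for u in range(n_nodes) if indegree[u] == 0]
    order = []
    while ready:
        logits = np.array([priorities[u] for u in ready], dtype=float)
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        u = ready.pop(int(rng.choice(len(ready), p=probs)))
        order.append(u)
        for v in succ[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                ready.append(v)
    return order

# Hypothetical 4-node DAG: 0 -> {1, 2}, 1 -> 3, 2 -> 3.
succ = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(sample_topological_order(succ, 4, priorities=[0.0, 2.0, -1.0, 0.5]))
```

At inference time one would typically take the argmax over the ready set instead of sampling, which reduces to Kahn's algorithm with a learned priority queue.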
Related papers
- Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have remarkably improved graph matching accuracy.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z) - Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z) - UniG-Encoder: A Universal Feature Encoder for Graph and Hypergraph Node
Classification [6.977634174845066]
A universal feature encoder for both graph and hypergraph representation learning, called UniG-Encoder, is designed.
The architecture starts with a forward transformation of the topological relationships of connected nodes into edge or hyperedge features.
The encoded node embeddings are then derived from the reversed transformation, described by the transpose of the projection matrix (a toy sketch follows this entry).
arXiv Detail & Related papers (2023-08-03T09:32:50Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use (a toy sketch follows this entry).
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Learning to Learn Graph Topologies [27.782971146122218]
We learn a mapping from node data to the graph structure based on the idea of learning to optimise (L2O).
The model is trained in an end-to-end fashion with pairs of node data and graph samples.
Experiments on both synthetic and real-world data demonstrate that our model is more efficient than classic iterative algorithms in learning a graph with specific topological properties.
arXiv Detail & Related papers (2021-10-19T08:42:38Z) - Pyramidal Reservoir Graph Neural Network [18.632681846787246]
We propose a deep Graph Neural Network (GNN) model that alternates two types of layers.
We show how graph pooling can reduce the computational complexity of the model.
Our proposed approach to the design of RC-based GNNs offers an advantageous and principled trade-off between accuracy and complexity.
arXiv Detail & Related papers (2021-04-10T08:34:09Z) - Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric models for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Heuristic Semi-Supervised Learning for Graph Generation Inspired by
Electoral College [80.67842220664231]
We propose a novel pre-processing technique, namely ELectoral COllege (ELCO), which automatically expands new nodes and edges to refine the label similarity within a dense subgraph.
In all setups tested, our method boosts the average score of base models by a large margin of 4.7 points and consistently outperforms the state-of-the-art.
arXiv Detail & Related papers (2020-06-10T14:48:48Z) - Optimal Transport Graph Neural Networks [31.191844909335963]
Current graph neural network (GNN) architectures naively average or sum node embeddings into an aggregated graph representation.
We introduce OT-GNN, a model that computes graph embeddings using parametric prototypes (a toy sketch follows this entry).
arXiv Detail & Related papers (2020-06-08T14:57:39Z)