Learning Graph Search Heuristics
- URL: http://arxiv.org/abs/2212.03978v1
- Date: Wed, 7 Dec 2022 22:28:00 GMT
- Title: Learning Graph Search Heuristics
- Authors: Michal P\'andy, Weikang Qiu, Gabriele Corso, Petar Veli\v{c}kovi\'c,
Rex Ying, Jure Leskovec, Pietro Li\`o
- Abstract summary: We present PHIL (Path Heuristic with Imitation Learning), a novel neural architecture and a training algorithm for discovering graph search and navigation heuristics from data.
Our function learns graph embeddings useful for inferring node distances, runs in constant time independent of graph sizes, and can be easily incorporated in an algorithm such as A* at test time.
Experiments show that PHIL reduces the number of explored nodes compared to state-of-the-art methods on benchmark datasets by 58.5% on average.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Searching for a path between two nodes in a graph is one of the most
well-studied and fundamental problems in computer science. In numerous domains
such as robotics, AI, or biology, practitioners develop search heuristics to
accelerate their pathfinding algorithms. However, it is a laborious and complex
process to hand-design heuristics based on the problem and the structure of a
given use case. Here we present PHIL (Path Heuristic with Imitation Learning),
a novel neural architecture and a training algorithm for discovering graph
search and navigation heuristics from data by leveraging recent advances in
imitation learning and graph representation learning. At training time, we
aggregate datasets of search trajectories and ground-truth shortest path
distances, which we use to train a specialized graph neural network-based
heuristic function using backpropagation through steps of the pathfinding
process. Our heuristic function learns graph embeddings useful for inferring
node distances, runs in constant time independent of graph sizes, and can be
easily incorporated in an algorithm such as A* at test time. Experiments show
that PHIL reduces the number of explored nodes compared to state-of-the-art
methods on benchmark datasets by 58.5% on average, can be directly applied in
diverse graphs ranging from biological networks to road networks, and allows
for fast planning in time-critical robotics domains.
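The abstract's key design point is that the learned heuristic is a drop-in replacement for a hand-designed one in A*. The sketch below shows the plug-in point: a standard A* over an adjacency-list graph that takes `heuristic` as a callable, so a learned node-distance estimator (as PHIL produces) or a classical estimate can be passed interchangeably. The graph and the `manhattan` stand-in heuristic are illustrative, not from the paper.

```python
import heapq

def a_star(graph, start, goal, heuristic):
    """A* search; `heuristic(node, goal)` can be any distance estimator,
    e.g. a learned model's node-distance prediction, as in PHIL."""
    # Frontier entries: (priority, cost so far, node, path to node).
    frontier = [(heuristic(start, goal), 0, start, [start])]
    best_cost = {start: 0}
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, cost
        for nbr, weight in graph.get(node, []):
            new_cost = cost + weight
            if new_cost < best_cost.get(nbr, float("inf")):
                best_cost[nbr] = new_cost
                priority = new_cost + heuristic(nbr, goal)
                heapq.heappush(frontier, (priority, new_cost, nbr, path + [nbr]))
    return None, float("inf")

# Toy grid-like graph; a learned heuristic would replace `manhattan` here.
graph = {
    (0, 0): [((0, 1), 1), ((1, 0), 1)],
    (0, 1): [((1, 1), 1)],
    (1, 0): [((1, 1), 3)],
    (1, 1): [],
}

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

path, cost = a_star(graph, (0, 0), (1, 1), manhattan)  # cost 2 via (0, 1)
```

Because only the `heuristic` argument changes, swapping in a trained model at test time requires no change to the search loop itself, which is the "easily incorporated in an algorithm such as A*" property the abstract claims.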
Related papers
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
These operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Learning heuristics for A* [9.701208207491879]
We combine recent advancements in Neural Algorithmic Reasoning to learn efficient heuristic functions for pathfinding problems on graphs.
We exploit multi-task learning to learn Dijkstra's algorithm and a consistent heuristic function for the A* search algorithm.
Results show that running A* over the learnt heuristic can greatly speed up target node search compared to Dijkstra's algorithm.
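In the multi-task setup this summary describes, exact Dijkstra distances provide one natural supervision signal for a learned heuristic. The sketch below computes such ground-truth shortest-path distances; using them as regression targets is an assumption about the training recipe, and all names here are illustrative rather than the paper's API.

```python
import heapq

def dijkstra_distances(graph, source):
    """Exact shortest-path distances from `source` over a weighted
    adjacency list; these can serve as ground-truth targets when
    training a heuristic to imitate Dijkstra."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 1)], "c": []}
targets = dijkstra_distances(graph, "a")  # {"a": 0, "b": 1, "c": 2}
```

A heuristic trained to regress these targets is admissible-by-construction only if it never overestimates them, which is why the summary emphasizes learning a *consistent* heuristic for A*.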
arXiv Detail & Related papers (2022-04-11T10:13:53Z) - Multi-Robot Active Mapping via Neural Bipartite Graph Matching [49.72892929603187]
We study the problem of multi-robot active mapping, which aims for complete scene map construction in minimum time steps.
The key to this problem lies in the goal position estimation to enable more efficient robot movements.
We propose a novel algorithm, namely NeuralCoMapping, which takes advantage of both approaches.
arXiv Detail & Related papers (2022-03-30T14:03:17Z) - Learnable Graph Matching: Incorporating Graph Partitioning with Deep
Feature Learning for Multiple Object Tracking [58.30147362745852]
Data association across frames is at the core of Multiple Object Tracking (MOT) task.
Existing methods mostly ignore the context information among tracklets and intra-frame detections.
We propose a novel learnable graph matching method to address these issues.
arXiv Detail & Related papers (2021-03-30T08:58:45Z) - Persistence Homology for Link Prediction: An Interactive View [15.068319518015421]
Link prediction is an important learning task for graph-structured data.
We propose a novel topological approach to characterize interactions between two nodes.
We also propose a graph neural network method that outperforms state-of-the-arts on different benchmarks.
arXiv Detail & Related papers (2021-02-20T04:33:59Z) - DiffMG: Differentiable Meta Graph Search for Heterogeneous Graph Neural
Networks [45.075163625895286]
We search for a meta graph, which can capture more complex semantic relations than a meta path, to determine how graph neural networks propagate messages along different types of edges.
We design an expressive search space in the form of a directed acyclic graph (DAG) to represent candidate meta graphs for a heterogeneous information network (HIN).
We propose a novel and efficient search algorithm to make the total search cost on a par with training a single GNN once.
arXiv Detail & Related papers (2020-10-07T08:09:29Z) - SIGN: Scalable Inception Graph Neural Networks [4.5158585619109495]
We propose a new, efficient and scalable graph deep learning architecture that sidesteps the need for graph sampling.
Our architecture allows using different local graph operators to best suit the task at hand.
We obtain state-of-the-art results on ogbn-papers100M, the largest public graph dataset, with over 110 million nodes and 1.5 billion edges.
arXiv Detail & Related papers (2020-04-23T14:46:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.