Graph Neural Networks for Motion Planning
- URL: http://arxiv.org/abs/2006.06248v2
- Date: Mon, 14 Dec 2020 17:07:58 GMT
- Title: Graph Neural Networks for Motion Planning
- Authors: Arbaaz Khan, Alejandro Ribeiro, Vijay Kumar, Anthony G. Francis
- Abstract summary: We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRT).
Experiments with critical sampling, a pendulum and a six DoF robot arm show GNNs improve on traditional analytic methods as well as learning approaches using fully-connected or convolutional neural networks.
- Score: 108.51253840181677
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper investigates the feasibility of using Graph Neural Networks (GNNs)
for classical motion planning problems. We propose guiding both continuous and
discrete planning algorithms using GNNs' ability to robustly encode the
topology of the planning space using a property called permutation invariance.
We present two techniques, GNNs over dense fixed graphs for low-dimensional
problems and sampling-based GNNs for high-dimensional problems. We examine the
ability of a GNN to tackle planning problems such as identifying critical nodes
or learning the sampling distribution in Rapidly-exploring Random Trees (RRT).
Experiments with critical sampling, a pendulum and a six DoF robot arm show
GNNs improve on traditional analytic methods as well as learning approaches
using fully-connected or convolutional neural networks.
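As a concrete illustration of the permutation-invariance property the abstract leans on, below is a minimal NumPy sketch of a message-passing layer that scores nodes, for example to bias an RRT sampling distribution toward critical regions. The layer sizes, the scoring head, and the toy graph are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def gnn_layer(H, A, W_self, W_nbr):
    # Each node mixes its own features with the mean of its neighbours'
    # features. Relabelling the nodes permutes the rows of H and A in the
    # same way, so the output rows permute identically: the layer is
    # permutation-equivariant, and any symmetric readout over nodes is
    # permutation-invariant.
    deg = A.sum(axis=1, keepdims=True) + 1e-8   # guard for isolated nodes
    nbr_mean = (A @ H) / deg                    # mean-aggregate neighbours
    return relu(H @ W_self + nbr_mean @ W_nbr)

def node_sampling_weights(H, A, params):
    # Two layers, then one scalar score per node, softmax-normalised so the
    # scores can serve as a sampling distribution (e.g. to guide RRT).
    (W1s, W1n), (W2s, W2n), w_out = params
    H = gnn_layer(H, A, W1s, W1n)
    H = gnn_layer(H, A, W2s, W2n)
    logits = H @ w_out
    e = np.exp(logits - logits.max())
    return e / e.sum()

rng = np.random.default_rng(0)
n, d, h = 6, 4, 8                               # toy sizes (assumed)
U = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(U, 1) + np.triu(U, 1).T             # symmetric, no self-loops
H = rng.normal(size=(n, d))                     # node features, e.g. coordinates
params = ((rng.normal(size=(d, h)), rng.normal(size=(d, h))),
          (rng.normal(size=(h, h)), rng.normal(size=(h, h))),
          rng.normal(size=(h,)))
p = node_sampling_weights(H, A, params)
print("sampling weights:", np.round(p, 3), "-> sampled node:", rng.choice(n, p=p))
```

Because the aggregation is a symmetric mean over neighbours, relabelling the nodes reorders the sampling weights without changing their values, which is the robust topology encoding the abstract refers to.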
Related papers
- Unleash Graph Neural Networks from Heavy Tuning [33.948899558876604]
Graph Neural Networks (GNNs) are deep-learning architectures designed for graph-type data.
We propose a graph conditional latent diffusion framework (GNN-Diff) to generate high-performing GNNs directly by learning from checkpoints saved during a light-tuning coarse search.
arXiv Detail & Related papers (2024-05-21T06:23:47Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
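DEGREE itself targets nonlinear GNNs; the hedged toy below (all sizes and names assumed) shows why contribution tracking is exact in the linear case: with linear aggregation, the graph-level prediction decomposes additively over input nodes.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 5, 3
U = (rng.random((n, n)) < 0.5).astype(float)
A_hat = np.eye(n) + np.triu(U, 1) + np.triu(U, 1).T   # adjacency + self-loops
X = rng.normal(size=(n, d))
W1, w2 = rng.normal(size=(d, d)), rng.normal(size=(d,))

def predict(X_in):
    # Two rounds of linear aggregation, then a sum readout to one scalar.
    return (A_hat @ (A_hat @ X_in @ W1)).sum(axis=0) @ w2

full = predict(X)
contribs = []
for j in range(n):                       # contribution of node j in isolation
    X_j = np.zeros_like(X)
    X_j[j] = X[j]
    contribs.append(predict(X_j))
print(np.isclose(full, sum(contribs)))   # True: contributions sum exactly
```

With nonlinear activations this additivity breaks, which is why a dedicated decomposition of the generation and aggregation mechanism is needed.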
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- GNN-Ensemble: Towards Random Decision Graph Neural Networks [3.7620848582312405]
Graph Neural Networks (GNNs) have enjoyed wide spread applications in graph-structured data.
GNNs are required to learn latent patterns from a limited amount of training data to perform inferences on a vast amount of test data.
In this paper, we take a step forward in ensemble learning for GNNs, improving accuracy, robustness, and resistance to adversarial attacks.
arXiv Detail & Related papers (2023-03-20T18:24:01Z)
- Two-level Graph Neural Network [15.014364222532347]
We propose a novel GNN framework, referred to as the Two-level GNN (TL-GNN), which merges subgraph-level information with node-level information.
Experiments show that TL-GNN outperforms existing GNNs and achieves state-of-the-art performance.
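A hedged sketch of the two-level idea follows; the 1-hop mean used as the subgraph summary is an assumption for illustration, while TL-GNN's actual subgraph construction is more elaborate.

```python
import numpy as np

def two_level_features(H, A):
    # Concatenate node-level features with a subgraph-level summary
    # (here: the mean over each node's 1-hop neighbourhood).
    deg = A.sum(axis=1, keepdims=True) + 1e-8
    subgraph_summary = (A @ H) / deg
    return np.concatenate([H, subgraph_summary], axis=1)

A = np.array([[0., 1.], [1., 0.]])
H = np.array([[1., 0.], [0., 1.]])
print(two_level_features(H, A))   # shape (2, 4): node-level || subgraph-level
```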
arXiv Detail & Related papers (2022-01-03T02:15:20Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
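Weight-preserving penalties of this kind typically take a quadratic form; the sketch below uses standard continual-learning notation and is an assumption about the general shape of such a loss, not TWP's exact objective (whose importance measure additionally reflects the graph topology):

$$
\mathcal{L}_{\text{new}}(\theta) = \mathcal{L}_{\text{task}}(\theta) + \lambda \sum_i I_i \left(\theta_i - \theta_i^{*}\right)^2
$$

Here $\theta^{*}$ are the weights after the previous task and $I_i$ estimates how important weight $i$ was to it, so moving important weights on a new task is penalized, which mitigates forgetting.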
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
- Learning Graph Neural Networks with Approximate Gradient Descent [24.49427608361397]
Two types of graph neural networks (GNNs) are investigated, depending on whether labels are attached to nodes or graphs.
A comprehensive framework for designing and analyzing convergence of GNN training algorithms is developed.
The proposed algorithm guarantees a linear convergence rate to the underlying true parameters of GNNs.
arXiv Detail & Related papers (2020-12-07T02:54:48Z)
- GNNLens: A Visual Analytics Approach for Prediction Error Diagnosis of Graph Neural Networks [42.222552078920216]
Graph Neural Networks (GNNs) aim to extend deep learning techniques to graph data.
GNNs behave like black boxes, with their details hidden from model developers and users.
It is therefore difficult to diagnose possible errors in GNNs.
This paper fills the research gap with an interactive visual analysis tool, GNNLens, to assist model developers and users in understanding and analyzing GNNs.
arXiv Detail & Related papers (2020-11-22T16:09:08Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem, which yields a unified framework (UGNN).
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
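The denoising view admits a short worked form; the notation below is the standard one for this line of work rather than a quotation from the paper. With node features $X$, normalized adjacency $\tilde{A}$, and Laplacian $L = I - \tilde{A}$, the denoising problem is

$$
\min_{F}\; \lVert F - X \rVert_F^2 + c\,\operatorname{tr}\left(F^{\top} L F\right),
$$

and a single gradient step of size $1/2$ started at $F = X$ yields $F \leftarrow X - cLX = (1-c)X + c\tilde{A}X$, which is exactly a GCN-style neighborhood aggregation of the input features.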
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Stochastic Graph Neural Networks [123.39024384275054]
Graph neural networks (GNNs) model nonlinear representations in graph data with applications in distributed agent coordination, control, and planning.
Current GNN architectures assume ideal scenarios and ignore link fluctuations that occur due to environment, human factors, or external attacks.
In these situations, the GNN fails to address its distributed task if the topological randomness is not considered accordingly.
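A hedged sketch of the setting follows; the Bernoulli edge-drop model and keep-probability are illustrative assumptions, not the paper's exact perturbation model. Training a GNN over such randomly perturbed adjacency matrices is one way to make it robust, in expectation, to link fluctuations.

```python
import numpy as np

def perturb_links(A, keep_prob=0.8, rng=None):
    # Drop each undirected edge independently with probability 1 - keep_prob,
    # simulating link failures caused by environment, humans, or attacks.
    rng = rng or np.random.default_rng()
    keep = np.triu(rng.random(A.shape) < keep_prob, 1).astype(float)
    keep = keep + keep.T                    # keep the graph undirected
    return A * keep

A = np.ones((4, 4)) - np.eye(4)             # toy complete graph on 4 nodes
print(perturb_links(A, keep_prob=0.5, rng=np.random.default_rng(2)))
```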
arXiv Detail & Related papers (2020-06-04T08:00:00Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
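A hedged sketch of the usual binarization trick (BGN's scheme differs in its details): replace full-precision weights with a scaled sign matrix in the forward pass, which is where the time and space savings come from.

```python
import numpy as np

def binarize(W):
    # Map weights to alpha * {-1, +1}. The per-matrix scale alpha keeps the
    # binarized weights at roughly the magnitude of the full-precision ones
    # (XNOR-Net-style scaling, assumed here for illustration; sign(0) maps
    # to 0 and is left as-is in this sketch).
    alpha = np.mean(np.abs(W))
    return alpha * np.sign(W)

W = np.random.default_rng(3).normal(size=(3, 3))
print(binarize(W))   # used in place of W when aggregating at inference time
```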
arXiv Detail & Related papers (2020-04-19T09:43:14Z)