Graph Neural Distance Metric Learning with Graph-Bert
- URL: http://arxiv.org/abs/2002.03427v1
- Date: Sun, 9 Feb 2020 18:58:31 GMT
- Title: Graph Neural Distance Metric Learning with Graph-Bert
- Authors: Jiawei Zhang
- Abstract summary: We will introduce a new graph neural network based distance metric learning approach, namely GB-DISTANCE (GRAPH-BERT based Neural Distance).
GB-DISTANCE can learn graph representations effectively based on a pre-trained GRAPH-BERT model.
In addition, GB-DISTANCE can also maintain the distance metric basic properties.
- Score: 10.879701971582502
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph distance metric learning serves as the foundation for many graph
learning problems, e.g., graph clustering, graph classification and graph
matching. Existing research works on graph distance metric (or graph kernel)
learning fail to maintain the basic properties of such metrics, e.g.,
non-negativity, identity of indiscernibles, symmetry and the triangle
inequality. In this paper, we will introduce a new graph neural network based
distance metric learning approach, namely GB-DISTANCE (GRAPH-BERT based
Neural Distance). Solely based on the attention mechanism, GB-DISTANCE can
learn graph instance representations effectively based on a pre-trained
GRAPH-BERT model. Different from the existing supervised/unsupervised metrics,
GB-DISTANCE can be learned effectively in a semi-supervised manner. In
addition, GB-DISTANCE can also maintain the distance metric basic properties
mentioned above. Extensive experiments have been done on several benchmark
graph datasets, and the results demonstrate that GB-DISTANCE can outperform
the existing baseline methods, especially the recent graph neural network
based graph metrics, by a significant margin in computing graph distances.
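The four metric axioms the abstract lists (non-negativity, identity of indiscernibles, symmetry, triangle inequality) hold automatically for any distance defined as a norm of embedding differences. A minimal sketch of checking this, using random vectors as stand-ins for graph embeddings (the `embeddings` dict and `graph_distance` helper are illustrative, not the paper's actual code):

```python
import numpy as np

# Hypothetical graph embeddings, standing in for vectors produced by a
# pre-trained encoder such as GRAPH-BERT (random here for illustration).
rng = np.random.default_rng(0)
embeddings = {name: rng.standard_normal(8) for name in ["g1", "g2", "g3"]}

def graph_distance(a, b):
    """Euclidean distance between embeddings; a norm of differences
    satisfies all four metric axioms by construction."""
    return np.linalg.norm(embeddings[a] - embeddings[b])

# Non-negativity
assert graph_distance("g1", "g2") >= 0
# Symmetry
assert np.isclose(graph_distance("g1", "g2"), graph_distance("g2", "g1"))
# Identity of indiscernibles (a graph is at distance zero from itself)
assert graph_distance("g1", "g1") == 0
# Triangle inequality (small tolerance for floating point)
assert graph_distance("g1", "g3") <= (
    graph_distance("g1", "g2") + graph_distance("g2", "g3") + 1e-9
)
```

The challenge the paper addresses is that learned pairwise scoring functions (e.g., a network applied to a pair of graphs) do not get these guarantees for free the way a norm-based formulation does.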
Related papers
- Parametric Graph Representations in the Era of Foundation Models: A Survey and Position [69.48708136448694]
Graphs have been widely used in the past decades of big data and AI to model comprehensive relational data.
Identifying meaningful graph laws can significantly enhance the effectiveness of various applications.
arXiv Detail & Related papers (2024-10-16T00:01:31Z)
- Graph Neural Networks with a Distribution of Parametrized Graphs [27.40566674759208]
We introduce latent variables to parameterize and generate multiple graphs.
We obtain the maximum likelihood estimate of the network parameters in an Expectation-Maximization framework.
arXiv Detail & Related papers (2023-10-25T06:38:24Z)
- Extended Graph Assessment Metrics for Graph Neural Networks [13.49677006107642]
We introduce extended graph assessment metrics (GAMs) for regression tasks and continuous adjacency matrices.
We show the correlation of these metrics with model performance on different medical population graphs and under different learning settings.
arXiv Detail & Related papers (2023-07-13T13:55:57Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structured data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Graph Self-supervised Learning with Accurate Discrepancy Learning [64.69095775258164]
We propose a framework that aims to learn the exact discrepancy between the original and the perturbed graphs, coined as Discrepancy-based Self-supervised LeArning (D-SLA).
We validate our method on various graph-related downstream tasks, including molecular property prediction, protein function prediction, and link prediction tasks, on which our model largely outperforms relevant baselines.
arXiv Detail & Related papers (2022-02-07T08:04:59Z)
- Graph Contrastive Learning with Augmentations [109.23158429991298]
We propose a graph contrastive learning (GraphCL) framework for learning unsupervised representations of graph data.
We show that our framework can produce graph representations of similar or better generalizability, transferability, and robustness compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-10-22T20:13:43Z)
- Learning Graph Edit Distance by Graph Neural Networks [3.002973807612758]
We propose a new framework able to combine the advances on deep metric learning with traditional approximations of the graph edit distance.
Our method employs a message passing neural network to capture the graph structure, leveraging this information for distance computation.
arXiv Detail & Related papers (2020-08-17T21:49:59Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
- Graph-Bert: Only Attention is Needed for Learning Graph Representations [22.031852733026]
The dominant graph neural networks (GNNs) over-rely on the graph links, causing serious performance problems.
In this paper, we will introduce a new graph neural network, namely GRAPH-BERT (Graph based BERT).
The experimental results have demonstrated that GRAPH-BERT can outperform the existing GNNs in both learning effectiveness and efficiency.
arXiv Detail & Related papers (2020-01-15T05:56:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.