Deep Graph Matching Consensus
- URL: http://arxiv.org/abs/2001.09621v1
- Date: Mon, 27 Jan 2020 08:05:57 GMT
- Title: Deep Graph Matching Consensus
- Authors: Matthias Fey, Jan E. Lenssen, Christopher Morris, Jonathan Masci, Nils M. Kriege
- Abstract summary: This work presents a two-stage neural architecture for learning and refining structural correspondences between graphs.
First, we use localized node embeddings computed by a graph neural network to obtain an initial ranking of soft correspondences between nodes.
Second, we employ synchronous message passing networks to iteratively re-rank the soft correspondences to reach a matching consensus in local neighborhoods between graphs.
- Score: 19.94426142777885
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work presents a two-stage neural architecture for learning and refining
structural correspondences between graphs. First, we use localized node
embeddings computed by a graph neural network to obtain an initial ranking of
soft correspondences between nodes. Second, we employ synchronous message
passing networks to iteratively re-rank the soft correspondences to reach a
matching consensus in local neighborhoods between graphs. We show,
theoretically and empirically, that our message passing scheme computes a
well-founded measure of consensus for corresponding neighborhoods, which is
then used to guide the iterative re-ranking process. Our purely local and
sparsity-aware architecture scales well to large, real-world inputs while still
being able to recover global correspondences consistently. We demonstrate the
practical effectiveness of our method on real-world tasks from the fields of
computer vision and entity alignment between knowledge graphs, on which we
improve upon the current state-of-the-art. Our source code is available under
https://github.com/rusty1s/deep-graph-matching-consensus.
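The two-stage scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a dense similarity with a row-wise softmax stands in for the GNN-based initial ranking (stage 1), and random node colorings pushed through one synchronous message-passing step stand in for the learned neighborhood-consensus refinement (stage 2). All function names and parameters here are hypothetical.

```python
import numpy as np

def softmax_rows(logits):
    """Row-wise softmax, keeping each node's correspondences a distribution."""
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def initial_correspondences(h_s, h_t):
    """Stage 1 (sketch): rank soft correspondences by embedding similarity."""
    return softmax_rows(h_s @ h_t.T)

def consensus_refine(S, A_s, A_t, steps=3, alpha=0.5, dim=8, seed=0):
    """Stage 2 (sketch): distribute random node colorings on the source graph,
    transport them to the target graph via the current soft matching S, run one
    synchronous message-passing (neighbor aggregation) step on each side, and
    re-rank pairs whose aggregated neighborhoods agree."""
    rng = np.random.default_rng(seed)
    logits = np.log(S + 1e-12)
    for _ in range(steps):
        x_s = rng.standard_normal((A_s.shape[0], dim))  # random colors on source
        x_t = S.T @ x_s                                 # transport to target via S
        m_s, m_t = A_s @ x_s, A_t @ x_t                 # aggregate over neighbors
        agree = -np.linalg.norm(m_s[:, None, :] - m_t[None, :, :], axis=-1)
        logits = logits + alpha * agree                 # reward consistent pairs
        S = softmax_rows(logits)
    return S

# Toy usage: match a 3-node path graph against itself.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
h = np.random.default_rng(1).standard_normal((3, 4))   # stand-in GNN embeddings
S = consensus_refine(initial_correspondences(h, h), A, A)
```

Because refinement only touches each node's neighborhood, the update stays local and sparse, which mirrors the scalability claim in the abstract.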
Related papers
- Know Your Neighborhood: General and Zero-Shot Capable Binary Function Search Powered by Call Graphlets [0.7646713951724013]
This paper proposes a novel graph neural network architecture combined with a novel graph data representation called call graphlets.
A specialized graph neural network model operates on this graph representation, learning to map it to a feature vector that encodes semantic binary code similarities.
Experimental results show that the combination of call graphlets and the novel graph neural network architecture achieves comparable or state-of-the-art performance.
arXiv Detail & Related papers (2024-06-02T18:26:50Z) - GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z) - Graph Context Transformation Learning for Progressive Correspondence Pruning [26.400567961735234]
We propose Graph Context Transformation Network (GCT-Net) enhancing context information to conduct consensus guidance for progressive correspondence pruning.
Specifically, we design the Graph Context Enhance Transformer which first generates the graph network and then transforms it into multi-branch graph contexts.
To further apply the recalibrated graph contexts to the global domain, we propose the Graph Context Guidance Transformer.
arXiv Detail & Related papers (2023-12-26T09:43:30Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - DenseGAP: Graph-Structured Dense Correspondence Learning with Anchor Points [15.953570826460869]
Establishing dense correspondence between two images is a fundamental computer vision problem.
We introduce DenseGAP, a new solution for efficient Dense correspondence learning with a Graph-structured neural network conditioned on Anchor Points.
Our method advances the state-of-the-art of correspondence learning on most benchmarks.
arXiv Detail & Related papers (2021-12-13T18:59:30Z) - Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object and SPair-71k).
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z) - Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z) - Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode the data structure in high dimensional and non-Euclidean feature space.
CensNet is a general graph embedding framework, which embeds both nodes and edges to a latent feature space.
Our approach achieves or matches the state-of-the-art performance in four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z) - Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences arising from its use.