Local Neighbor Propagation Embedding
- URL: http://arxiv.org/abs/2006.16009v1
- Date: Mon, 29 Jun 2020 12:49:22 GMT
- Title: Local Neighbor Propagation Embedding
- Authors: Shenglan Liu and Yang Yu
- Abstract summary: We introduce neighbor propagation into LLE and propose Local Neighbor Propagation Embedding (LNPE).
LNPE enhances the local connections and interactions between neighborhoods by extending $1$-hop neighbors into $n$-hop neighbors.
Experiments show that LNPE could obtain more faithful and robust embeddings with better topological and geometrical properties.
- Score: 10.120548476934186
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Manifold Learning occupies a vital role in the field of nonlinear
dimensionality reduction and its ideas also serve for other relevant methods.
Graph-based methods such as Graph Convolutional Networks (GCN) show ideas in
common with manifold learning, although they belong to different fields.
Inspired by GCN, we introduce neighbor propagation into LLE and propose Local
Neighbor Propagation Embedding (LNPE). With only a linear increase in
computational complexity over LLE, LNPE enhances the local connections and
interactions between neighborhoods by extending $1$-hop neighbors to $n$-hop
neighbors. The experimental results show that LNPE could obtain more faithful
and robust embeddings with better topological and geometrical properties.
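The core step of extending a $1$-hop kNN neighborhood to $n$-hop neighbors can be sketched as boolean reachability on the neighborhood graph. This is a minimal illustration of the idea, not the authors' implementation; the function names are our own:

```python
import numpy as np

def knn_adjacency(X, k):
    """Boolean 1-hop adjacency of the k-nearest-neighbor graph (self excluded)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    idx = np.argsort(d, axis=1)[:, :k]
    A = np.zeros((len(X), len(X)), dtype=bool)
    A[np.repeat(np.arange(len(X)), k), idx.ravel()] = True
    return A

def n_hop_neighbors(A, n):
    """Nodes reachable within n hops of each node (the node itself excluded)."""
    reach = A.astype(bool).copy()
    frontier = reach.copy()
    for _ in range(n - 1):
        # one more hop: integer matmul, then threshold back to boolean
        frontier = (frontier.astype(int) @ A.astype(int)) > 0
        reach |= frontier
    np.fill_diagonal(reach, False)
    return reach
```

In LNPE these enlarged neighborhoods then enter an LLE-style reconstruction, so each point is expressed through a richer, better-connected local patch.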
Related papers
- LSGNN: Towards General Graph Neural Network in Node Classification by Local Similarity [59.41119013018377]
We propose to use the local similarity (LocalSim) to learn node-level weighted fusion, which can also serve as a plug-and-play module.
For better fusion, we propose a novel and efficient Initial Residual Difference Connection (IRDC) to extract more informative multi-hop information.
Our proposed method, namely Local Similarity Graph Neural Network (LSGNN), can offer comparable or superior state-of-the-art performance on both homophilic and heterophilic graphs.
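The node-level weighted fusion idea can be illustrated with a crude stand-in for LocalSim: here we simply take the cosine similarity between a node's features and its neighborhood mean, and use it to weight the fusion. This is our simplification for illustration; LSGNN's actual module is learned:

```python
import numpy as np

def localsim_fusion(A, X):
    """Weight each node's fusion of its own vs. aggregated features by a
    similarity-to-neighborhood score (a simplified, non-learned stand-in)."""
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ X) / np.maximum(deg, 1)           # neighborhood mean
    num = (X * agg).sum(axis=1)
    den = np.linalg.norm(X, axis=1) * np.linalg.norm(agg, axis=1) + 1e-12
    sim = ((num / den) + 1) / 2                  # cosine mapped to [0, 1]
    s = sim.reshape(-1, 1)
    # homophilic nodes (high sim) lean on neighbors; heterophilic keep their own
    return s * agg + (1 - s) * X
```

The point of the design is that the fusion weight is computed per node, so homophilic and heterophilic regions of the same graph can be handled differently.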
arXiv Detail & Related papers (2023-05-07T09:06:11Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Contrastive Adaptive Propagation Graph Neural Networks for Efficient Graph Learning [65.08818785032719]
Graph Neural Networks (GNNs) have achieved great success in processing graph data by extracting and propagating structure-aware features.
Recently the field has advanced from local propagation schemes that focus on local neighbors towards extended propagation schemes that can directly deal with extended neighbors consisting of both local and high-order neighbors.
Despite the impressive performance, existing approaches are still insufficient to build an efficient and learnable extended propagation scheme that can adaptively adjust the influence of local and high-order neighbors.
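The contrast between local and extended propagation can be sketched by mixing 1-hop and higher-order propagated features with a per-node weight. In the paper this weight is adaptive and learned; here it is a fixed input, purely to show the structure of the computation:

```python
import numpy as np

def normalize(A):
    """Row-normalize an adjacency matrix after adding self-loops."""
    A = A + np.eye(len(A))
    return A / A.sum(axis=1, keepdims=True)

def adaptive_propagate(A, X, alpha, hops=3):
    """Mix 1-hop and high-order propagated features with a per-node weight
    alpha in [0, 1] (fixed here; learnable in an adaptive scheme)."""
    P = normalize(A)
    local = P @ X                                 # 1-hop neighborhood average
    high = np.linalg.matrix_power(P, hops) @ X    # high-order average
    a = np.asarray(alpha).reshape(-1, 1)
    return a * local + (1 - a) * high
```

Setting `alpha` per node is what makes the scheme able to emphasize local neighbors on some nodes and high-order neighbors on others.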
arXiv Detail & Related papers (2021-12-02T10:35:33Z)
- On Representation Knowledge Distillation for Graph Neural Networks [15.82821940784549]
We study whether preserving the global topology of how the teacher embeds graph data can be a more effective distillation objective for GNNs.
We propose two new approaches which better preserve global topology: (1) Global Structure Preserving loss (GSP) and (2) Graph Contrastive Representation Distillation (G-CRD)
arXiv Detail & Related papers (2021-11-09T06:22:27Z)
- An Entropy-guided Reinforced Partial Convolutional Network for Zero-Shot Learning [77.72330187258498]
We propose a novel Entropy-guided Reinforced Partial Convolutional Network (ERPCNet)
ERPCNet extracts and aggregates localities based on semantic relevance and visual correlations without human-annotated regions.
It not only discovers global-cooperative localities dynamically but also converges faster for policy gradient optimization.
arXiv Detail & Related papers (2021-11-03T11:13:13Z)
- RMNA: A Neighbor Aggregation-Based Knowledge Graph Representation Learning Model Using Rule Mining [9.702290899930608]
Neighbor aggregation-based representation learning (NARL) models encode the information in the neighbors of an entity into its embeddings.
We propose a NARL model named RMNA, which obtains and filters horn rules through a rule mining algorithm, and uses selected horn rules to transform valuable multi-hop neighbors into one-hop neighbors.
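The "multi-hop neighbors into one-hop neighbors" step amounts to materializing a horn rule as shortcut edges. A minimal sketch for a two-atom rule (illustrative only; RMNA obtains and filters its rules by mining, and the relation names below are hypothetical):

```python
from collections import defaultdict

def apply_horn_rule(triples, body, head):
    """Materialize a two-atom horn rule body=(r1, r2) => head:
    every path x -r1-> y -r2-> z yields the one-hop triple (x, head, z)."""
    r1, r2 = body
    by_rel = defaultdict(list)
    for h, r, t in triples:
        by_rel[r].append((h, t))
    starts = defaultdict(list)            # y -> all x with x -r1-> y
    for x, y in by_rel[r1]:
        starts[y].append(x)
    new = set()
    for y, z in by_rel[r2]:
        for x in starts.get(y, []):
            new.add((x, head, z))
    return new
```

After materialization, a plain one-hop neighbor aggregator sees the information that previously required a two-hop walk.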
arXiv Detail & Related papers (2021-11-01T02:08:26Z)
- Tree Decomposed Graph Neural Network [11.524511007436791]
We propose a tree decomposition method to disentangle neighborhoods in different layers to alleviate feature smoothing.
We also characterize the multi-hop dependency via graph diffusion within our tree decomposition formulation to construct Tree Decomposed Graph Neural Network (TDGNN)
Comprehensive experiments demonstrate the superior performance of TDGNN on both homophily and heterophily networks.
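Characterizing multi-hop dependency via graph diffusion can be sketched as a weighted sum of powers of the normalized adjacency applied to the features. This is a generic diffusion, not TDGNN's exact tree-decomposed scheme:

```python
import numpy as np

def diffusion_features(A, X, weights):
    """Aggregate multi-hop information as sum_k w_k * P^k @ X, where P is the
    row-normalized adjacency with self-loops (a generic diffusion sketch)."""
    P = A + np.eye(len(A))
    P = P / P.sum(axis=1, keepdims=True)
    out = weights[0] * X                  # k = 0 term: the node's own features
    Pk = np.eye(len(A))
    for w in weights[1:]:
        Pk = Pk @ P                       # advance one hop
        out = out + w * (Pk @ X)
    return out
```

The weight profile over hops controls how much each distance contributes, which is the knob that tree decomposition and diffusion-based methods tune per layer.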
arXiv Detail & Related papers (2021-08-25T02:47:16Z)
- Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks [68.9026534589483]
RioGNN is a novel Reinforced, recursive and flexible neighborhood selection guided multi-relational Graph Neural Network architecture.
RioGNN can learn more discriminative node embedding with enhanced explainability due to the recognition of individual importance of each relation.
arXiv Detail & Related papers (2021-04-16T04:30:06Z)
- Neighborhood Matching Network for Entity Alignment [71.24217694278616]
Neighborhood Matching Network (NMN) is a novel entity alignment framework.
NMN estimates the similarities between entities to capture both the topological structure and the neighborhood difference.
It first uses a novel graph sampling method to distill a discriminative neighborhood for each entity.
It then adopts a cross-graph neighborhood matching module to jointly encode the neighborhood difference for a given entity pair.
arXiv Detail & Related papers (2020-05-12T08:26:15Z)
- Empirical Studies on the Properties of Linear Regions in Deep Neural Networks [34.08593191989188]
A deep neural network (DNN) with piecewise linear activations can partition the input space into numerous small linear regions.
It is believed that the number of these regions represents the expressivity of the DNN.
We study their local properties, such as the inspheres, the directions of the corresponding hyperplanes, the decision boundaries, and the relevance of the surrounding regions.
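The notion of a linear region can be made concrete by recording each input's ReLU activation pattern: two inputs that switch the same units on lie in the same region. A small sketch (the random network and sample count are our choices, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def activation_pattern(x, layers):
    """Return the ReLU on/off pattern that input x induces at each hidden
    layer; inputs sharing a pattern lie in the same linear region."""
    pattern = []
    h = x
    for W, b in layers:
        z = W @ h + b
        pattern.append(tuple(z > 0))
        h = np.maximum(z, 0.0)
    return tuple(pattern)

# a tiny 2-16-16 ReLU net with random weights
layers = [(rng.standard_normal((16, 2)), rng.standard_normal(16)),
          (rng.standard_normal((16, 16)), rng.standard_normal(16))]
samples = rng.uniform(-1, 1, size=(2000, 2))
regions = {activation_pattern(x, layers) for x in samples}
```

Counting the distinct patterns hit by the samples gives a lower bound on the number of linear regions the network carves out of the sampled domain.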
arXiv Detail & Related papers (2020-01-04T12:47:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.