Graphs, Entities, and Step Mixture
- URL: http://arxiv.org/abs/2005.08485v2
- Date: Wed, 24 Jun 2020 05:46:48 GMT
- Title: Graphs, Entities, and Step Mixture
- Authors: Kyuyong Shin, Wonyoung Shin, Jung-Woo Ha, Sunyoung Kwon
- Abstract summary: We propose a new graph neural network that considers both edge-based neighborhood relationships and node-based entity features.
Through extensive experiments, we show that the proposed GESM achieves state-of-the-art or comparable performance on eight benchmark graph datasets.
- Score: 11.162937043309478
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Existing approaches for graph neural networks commonly suffer from the
oversmoothing issue, regardless of how neighborhoods are aggregated. Most
methods also focus on transductive scenarios for fixed graphs, leading to poor
generalization to unseen graphs. To address these issues, we propose a new
graph neural network that considers both edge-based neighborhood relationships
and node-based entity features, i.e. Graph Entities with Step Mixture via
random walk (GESM). GESM employs a mixture of various steps through random walk
to alleviate the oversmoothing problem, attention to dynamically reflect
interrelations depending on node information, and structure-based
regularization to enhance embedding representation. Through extensive experiments,
we show that the proposed GESM achieves state-of-the-art or comparable
performance on eight benchmark graph datasets comprising transductive and
inductive learning tasks. Furthermore, we empirically demonstrate the
significance of considering global information.
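The abstract's step-mixture mechanism can be pictured as propagating node features over several random-walk steps and keeping every intermediate representation rather than only the deepest one. Below is a minimal NumPy sketch of that idea, assuming a row-normalized transition matrix and plain concatenation across steps; the names step_mixture and n_steps are illustrative and not taken from the authors' code, and the attention and structure-based regularization components are omitted.

```python
import numpy as np

def step_mixture(adj, features, n_steps=4):
    """Propagate features over 1..n_steps random-walk hops and mix them.

    adj:      (N, N) adjacency matrix with self-loops
    features: (N, F) node feature matrix
    returns:  (N, F * n_steps) concatenation of all intermediate steps
    """
    # Row-normalize to obtain the random-walk transition matrix P = D^{-1} A.
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.maximum(deg, 1e-12)

    reps, h = [], features
    for _ in range(n_steps):
        h = P @ h          # one further random-walk propagation step
        reps.append(h)     # keep every step, not only the deepest one
    # Mixing shallow (local) and deep (global) steps is what the abstract
    # credits with easing oversmoothing while still using global information.
    return np.concatenate(reps, axis=1)
```

For example, with a four-node graph and two-dimensional features, the output stacks the 1- through 4-step representations side by side, which a downstream attention layer could then weight per node as the abstract describes.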
Related papers
- Towards Generalizability of Multi-Agent Reinforcement Learning in Graphs with Recurrent Message Passing [0.9353820277714449]
In decentralized approaches, agents operate within a given graph and make decisions based on partial or outdated observations.
This work focuses on generalizability and resolves the trade-off in observed neighborhood size with a continuous information flow in the whole graph.
Our approach can be used in a decentralized manner at runtime and in combination with a reinforcement learning algorithm of choice.
arXiv Detail & Related papers (2024-02-07T16:53:09Z)
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- Graph Transformer GANs for Graph-Constrained House Generation [223.739067413952]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The GTGAN learns effective graph node relations in an end-to-end fashion for the challenging graph-constrained house generation task.
arXiv Detail & Related papers (2023-03-14T20:35:45Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- DisenHAN: Disentangled Heterogeneous Graph Attention Network for Recommendation [11.120241862037911]
Heterogeneous information networks have been widely used to alleviate sparsity and cold-start problems in recommender systems.
We propose a novel disentangled heterogeneous graph attention network DisenHAN for top-$N$ recommendation.
arXiv Detail & Related papers (2021-06-21T06:26:10Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates fake neighbor nodes as enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Graph Representation Learning via Graphical Mutual Information Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)