DisenE: Disentangling Knowledge Graph Embeddings
- URL: http://arxiv.org/abs/2010.14730v2
- Date: Thu, 12 Nov 2020 12:53:03 GMT
- Title: DisenE: Disentangling Knowledge Graph Embeddings
- Authors: Xiaoyu Kou, Yankai Lin, Yuntao Li, Jiahao Xu, Peng Li, Jie Zhou, Yan Zhang
- Abstract summary: DisenE is an end-to-end framework to learn disentangled knowledge graph embeddings.
We introduce an attention-based mechanism that enables the model to explicitly focus on relevant components of entity embeddings according to a given relation.
- Score: 33.169388832519
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge graph embedding (KGE), which aims to embed entities and
relations into low-dimensional vectors, has recently attracted wide attention.
However, existing research is mainly based on black-box neural models, which
makes the learned representations difficult to interpret. In this paper, we
introduce DisenE, an end-to-end framework for learning disentangled knowledge
graph embeddings. Specifically, we introduce an attention-based mechanism that
enables the model to explicitly focus on the components of an entity embedding
that are relevant to a given relation. Furthermore, we introduce two novel
regularizers that encourage each component of the entity representation to
independently reflect an isolated semantic aspect. Experimental results
demonstrate that DisenE offers a new perspective on the interpretability of
KGE and effectively improves performance on link prediction tasks.
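The attention mechanism described in the abstract can be sketched as follows. This is a minimal illustration in the spirit of DisenE, not the paper's exact model: the number of components, dimensions, the softmax attention, and the TransE-style scoring function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 4, 8  # assumed: K semantic components per entity, each d-dimensional

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(entity_components, relation):
    # Attention weights over the K components: how relevant each
    # component is to the given relation.
    scores = entity_components @ relation        # (K,)
    return softmax(scores)

def score(head_components, relation, tail_components):
    # Score the triple using only the attention-weighted (i.e. relation-
    # relevant) parts of the entity embeddings; TransE-style distance
    # is an illustrative choice here.
    alpha = attend(head_components, relation)    # (K,)
    h = alpha @ head_components                  # weighted sum -> (d,)
    t = alpha @ tail_components
    return -np.linalg.norm(h + relation - t)

h = rng.normal(size=(K, d))
t = rng.normal(size=(K, d))
r = rng.normal(size=d)
alpha = attend(h, r)
print(alpha)               # weights sum to 1; larger = more relevant component
print(score(h, r, t))
```

The paper's regularizers would additionally push the K components toward encoding independent semantic aspects; that training objective is omitted here.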
Related papers
- Explainable Representations for Relation Prediction in Knowledge Graphs [0.0]
We propose SEEK, a novel approach for explainable representations to support relation prediction in knowledge graphs.
It is based on identifying relevant shared semantic aspects between entities and learning representations for each subgraph.
We evaluate SEEK on two real-world relation prediction tasks: protein-protein interaction prediction and gene-disease association prediction.
arXiv Detail & Related papers (2023-06-22T06:18:40Z)
- Message Intercommunication for Inductive Relation Reasoning [49.731293143079455]
We develop a novel inductive relation reasoning model called MINES.
We introduce a Message Intercommunication mechanism on the Neighbor-Enhanced Subgraph.
Our experiments show that MINES outperforms existing state-of-the-art models.
arXiv Detail & Related papers (2023-05-23T13:51:46Z)
- Knowledge Graph Completion with Counterfactual Augmentation [23.20561746976504]
We introduce a counterfactual question: "would the relation still exist if the neighborhood of entities became different from observation?"
With a carefully designed instantiation of a causal model on the knowledge graph, we generate the counterfactual relations to answer the question.
We incorporate the created counterfactual relations with the GNN-based framework on KGs to augment their learning of entity pair representations.
arXiv Detail & Related papers (2023-02-25T14:08:15Z)
- Learning Attention-based Representations from Multiple Patterns for Relation Prediction in Knowledge Graphs [2.4028383570062606]
AEMP is a novel model for learning contextualized representations by acquiring entities' context information.
AEMP either outperforms or competes with state-of-the-art relation prediction methods.
arXiv Detail & Related papers (2022-06-07T10:53:35Z)
- MINER: Improving Out-of-Vocabulary Named Entity Recognition from an Information Theoretic Perspective [57.19660234992812]
NER models have achieved promising performance on standard NER benchmarks.
Recent studies show that previous approaches may over-rely on entity mention information, resulting in poor performance on out-of-vocabulary (OOV) entity recognition.
We propose MINER, a novel NER learning framework, to remedy this issue from an information-theoretic perspective.
arXiv Detail & Related papers (2022-04-09T05:18:20Z)
- Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment [9.701081498310165]
Entity alignment aims at integrating heterogeneous knowledge from different knowledge graphs.
Recent studies employ embedding-based methods that first learn representations of the knowledge graphs and then perform entity alignment.
We propose a Graph Convolutional Network (GCN) model equipped with knowledge distillation for entity alignment.
arXiv Detail & Related papers (2022-01-25T02:47:14Z)
- Knowledge Graph Augmented Network Towards Multiview Representation Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- RelWalk: A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model of word embeddings (Arora et al., 2016a) to knowledge graph embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective, motivated by the theoretical analysis, to learn KGEs from a given knowledge graph.
arXiv Detail & Related papers (2021-01-25T13:31:29Z)
- A Dependency Syntactic Knowledge Augmented Interactive Architecture for End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model is capable of fully exploiting the syntactic knowledge (dependency relations and types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn)
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-04T14:59:32Z)
- Graph Representation Learning via Graphical Mutual Information Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.