MEKER: Memory Efficient Knowledge Embedding Representation for Link
Prediction and Question Answering
- URL: http://arxiv.org/abs/2204.10629v1
- Date: Fri, 22 Apr 2022 10:47:03 GMT
- Authors: Viktoriia Chekalina, Anton Razzhigaev, Albert Sayapin, and Alexander
Panchenko
- Abstract summary: Knowledge Graphs (KGs) are symbolically structured stores of facts.
KG embeddings encode this information compactly for NLP tasks that require implicit knowledge about the real world.
We propose a memory-efficient KG embedding model that yields SOTA-comparable performance on link prediction tasks and KG-based Question Answering.
- Score: 65.62309538202771
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge Graphs (KGs) are symbolically structured stores of facts. KG
embeddings encode this information compactly for NLP tasks that require
implicit knowledge about the real world. However, the KGs useful in practical
NLP applications are enormous, and training embeddings over them raises memory
cost issues. We represent a KG as a 3rd-order binary tensor and move beyond
the standard CP decomposition by using a data-specific generalized version of
it. This generalization of the standard CP-ALS algorithm yields optimization
gradients without a backpropagation mechanism, which reduces the memory needed
in training while providing computational benefits. We propose MEKER, a
memory-efficient KG embedding model that yields SOTA-comparable performance on
link prediction tasks and KG-based Question Answering.
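The core idea in the abstract, representing a KG as a 3rd-order binary tensor and factorizing it with an ALS-style CP decomposition, can be illustrated with a minimal sketch. This is plain CP-ALS on a small dense toy tensor; MEKER's data-specific generalization and its memory optimizations are not reproduced here, and the toy KG is an invented example.

```python
import numpy as np

def cp_als(X, rank, n_iters=50, seed=0):
    """Plain CP-ALS for a dense 3rd-order tensor X of shape (I, J, K).
    Returns factor matrices A (I x R), B (J x R), C (K x R)."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iters):
        # Each update solves a least-squares problem against the other two
        # factors: MTTKRP (via einsum) times the pseudo-inverse of the
        # Hadamard product of Gram matrices.
        A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def reconstruct(A, B, C):
    """Rebuild the tensor as a sum of rank-1 terms."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Toy KG: 4 entities, 2 relations; X[s, r, o] = 1 iff the fact (s, r, o) holds.
X = np.zeros((4, 2, 4))
X[0, 0, 1] = X[1, 0, 2] = X[2, 1, 3] = X[0, 1, 3] = 1.0

A, B, C = cp_als(X, rank=4)
X_hat = reconstruct(A, B, C)
print(X_hat.shape)  # (4, 2, 4); entries of X_hat score candidate triples
```

For link prediction, rows of A and C act as subject/object entity embeddings and rows of B as relation embeddings; a candidate triple is scored by the corresponding entry of the reconstruction.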
Related papers
- Decoding on Graphs: Faithful and Sound Reasoning on Knowledge Graphs through Generation of Well-Formed Chains [66.55612528039894]
Knowledge Graphs (KGs) can serve as reliable knowledge sources for question answering (QA).
We present DoG (Decoding on Graphs), a novel framework that facilitates a deep synergy between LLMs and KGs.
Experiments across various KGQA tasks with different background KGs demonstrate that DoG achieves superior and robust performance.
arXiv Detail & Related papers (2024-10-24T04:01:40Z)
- Learning Rules from KGs Guided by Language Models [48.858741745144044]
Rule learning methods can be applied to predict potentially missing facts.
Ranking of rules is especially challenging over highly incomplete or biased KGs.
With the recent rise of Language Models (LMs) several works have claimed that LMs can be used as alternative means for KG completion.
arXiv Detail & Related papers (2024-09-12T09:27:36Z)
- Sharing Parameter by Conjugation for Knowledge Graph Embeddings in Complex Space [11.337803135227752]
A Knowledge Graph (KG) is the directed graphical representation of entities and relations in the real world.
The need to scale up and complete KGs automatically motivates Knowledge Graph Embedding (KGE).
We propose a parameter-sharing method, i.e., using conjugate parameters for complex numbers employed in KGE models.
Our method improves memory efficiency by 2x in relation embedding while achieving comparable performance to the state-of-the-art non-conjugate models.
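The parameter-sharing idea summarized above, storing one complex relation vector and deriving its conjugate on the fly, can be sketched roughly as follows. The ComplEx-style trilinear score used here is an illustrative assumption, not the paper's exact model; it shows why a conjugated relation vector serves the inverse direction without extra storage.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# One stored complex relation vector; its conjugate handles the inverse
# direction, so memory for the relation pair is halved.
r = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
r_inverse = np.conj(r)  # derived on the fly, no extra parameters

h = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
t = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)

def score(head, rel, tail):
    # ComplEx-style trilinear score: Re(<head, rel, conj(tail)>)
    return np.real(np.sum(head * rel * np.conj(tail)))

# Scoring (t, conj(r), h) mirrors (h, r, t): Re(sum t*conj(r)*conj(h))
# is the real part of the conjugate of sum h*r*conj(t), hence equal.
print(np.isclose(score(h, r, t), score(t, r_inverse, h)))  # True
```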
arXiv Detail & Related papers (2024-04-18T00:05:02Z)
- FedMKGC: Privacy-Preserving Federated Multilingual Knowledge Graph Completion [21.4302940596294]
Knowledge graph completion (KGC) aims to predict missing facts in knowledge graphs (KGs).
Previous methods that rely on transferring raw data among KGs raise privacy concerns.
We propose a new federated learning framework that implicitly aggregates knowledge from multiple KGs without demanding raw data exchange and entity alignment.
arXiv Detail & Related papers (2023-12-17T08:09:27Z)
- KGxBoard: Explainable and Interactive Leaderboard for Evaluation of Knowledge Graph Completion Models [76.01814380927507]
KGxBoard is an interactive framework for performing fine-grained evaluation on meaningful subsets of the data.
In our experiments, we highlight the findings with the use of KGxBoard, which would have been impossible to detect with standard averaged single-score metrics.
arXiv Detail & Related papers (2022-08-23T15:11:45Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It combines high-order reasoning into a graph convolutional network, namely HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- Sequence-to-Sequence Knowledge Graph Completion and Question Answering [8.207403859762044]
We show that an off-the-shelf encoder-decoder Transformer model can serve as a scalable and versatile KGE model.
We achieve this by posing KG link prediction as a sequence-to-sequence task and replacing the triple scoring approach taken by prior KGE methods with autoregressive decoding.
arXiv Detail & Related papers (2022-03-19T13:01:49Z)
- NodePiece: Compositional and Parameter-Efficient Representations of Large Knowledge Graphs [15.289356276538662]
We propose NodePiece, an anchor-based approach to learn a fixed-size entity vocabulary.
In NodePiece, a vocabulary of subword/sub-entity units is constructed from anchor nodes in a graph with known relation types.
Experiments show that NodePiece performs competitively in node classification, link prediction, and relation prediction tasks.
arXiv Detail & Related papers (2021-06-23T03:51:03Z)
- Toward Subgraph-Guided Knowledge Graph Question Generation with Graph Neural Networks [53.58077686470096]
Knowledge graph (KG) question generation (QG) aims to generate natural language questions from KGs and target answers.
In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers.
arXiv Detail & Related papers (2020-04-13T15:43:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.