Knowledge Graph Completion with Pre-trained Multimodal Transformer and
  Twins Negative Sampling
        - URL: http://arxiv.org/abs/2209.07084v1
- Date: Thu, 15 Sep 2022 06:50:31 GMT
- Title: Knowledge Graph Completion with Pre-trained Multimodal Transformer and
  Twins Negative Sampling
- Authors: Yichi Zhang, Wen Zhang
- Abstract summary: We propose a VisualBERT-enhanced Knowledge Graph Completion model (VBKGC for short).
VBKGC can capture deeply fused multimodal information for entities and integrate it into the KGC model.
We conduct extensive experiments to show the outstanding performance of VBKGC on the link prediction task.
- Score: 13.016173217017597
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract:   Knowledge graphs (KGs), which model world knowledge as structured
triples, are inevitably incomplete. The same problem exists for multimodal
knowledge graphs (MMKGs). Knowledge graph completion (KGC) is therefore of great
importance for predicting the missing triples in existing KGs. Among the
existing KGC methods, embedding-based methods rely on manual design to leverage
multimodal information, while fine-tuning-based approaches do not outperform
embedding-based methods in link prediction. To address these problems, we
propose a VisualBERT-enhanced Knowledge Graph Completion model (VBKGC for
short). VBKGC captures deeply fused multimodal information for entities and
integrates it into the KGC model. Besides, we achieve the co-design of the KGC
model and negative sampling by designing a new negative sampling strategy
called twins negative sampling. Twins negative sampling is suitable for
multimodal scenarios and aligns the different embeddings of an entity. We
conduct extensive experiments that show the outstanding performance of VBKGC on
the link prediction task, and we explore VBKGC further.
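The abstract does not spell out the mechanics of twins negative sampling, so the following is a minimal sketch under one plausible reading: a single set of corrupted entities is scored under both the structural and the multimodal embedding spaces, so the two views are trained against identical ("twin") negatives and thereby pushed into alignment. All names and the TransE-style scoring function below are illustrative assumptions, not VBKGC's actual implementation.

```python
import torch

def twins_negative_sampling_loss(
    h, r, t,                  # (B,) index tensors for a batch of positive triples
    emb_struct, emb_mm,       # entity embedding tables: structural and multimodal views
    rel_emb,                  # relation embedding table
    num_entities, k=32, margin=6.0,
):
    """Hypothetical sketch: draw ONE shared set of corrupted tails and score
    it under BOTH embedding views, so the views see identical ("twin")
    negatives. A TransE-style score is used purely for illustration."""
    neg_t = torch.randint(num_entities, (h.size(0), k))   # shared negatives

    def score(ent, tails):
        # TransE: -||h + r - t||_2, higher is better; tails has shape (B, m)
        hr = ent[h] + rel_emb[r]                                   # (B, d)
        return -(hr.unsqueeze(1) - ent[tails]).norm(p=2, dim=-1)  # (B, m)

    loss = 0.0
    for ent in (emb_struct, emb_mm):                      # same negatives per view
        pos = score(ent, t.unsqueeze(1))                  # (B, 1)
        neg = score(ent, neg_t)                           # (B, k)
        loss = loss + torch.relu(margin - pos + neg).mean()
    return loss
```

Sharing the corrupted entities across views, rather than drawing independent negatives per view, is the point of this reading: both embedding spaces are trained against the same contrast set, which nudges the structural and multimodal embeddings of each entity toward agreement.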
 
      
        Related papers
- Towards Improving Long-Tail Entity Predictions in Temporal Knowledge Graphs through Global Similarity and Weighted Sampling [53.11315884128402]
 Temporal Knowledge Graph (TKG) completion models traditionally assume access to the entire graph during training.
We present an incremental training framework specifically designed for TKGs, aiming to address entities that are either not observed during training or have sparse connections.
Our approach combines a model-agnostic enhancement layer with a weighted sampling strategy that can be added to, and improve, any existing TKG completion method.
 arXiv (2025-07-25)
- MuCo-KGC: Multi-Context-Aware Knowledge Graph Completion [0.0]
 Multi-Context-Aware Knowledge Graph Completion (MuCo-KGC) is a novel model that utilizes contextual information from linked entities and relations within the graph to predict tail entities.
MuCo-KGC eliminates the need for entity descriptions and negative triplet sampling, significantly reducing computational complexity while enhancing performance (a generic sketch of this classification-style training follows this entry).
Our experiments on standard datasets, including FB15k-237, WN18RR, CoDEx-S, and CoDEx-M, demonstrate that MuCo-KGC outperforms state-of-the-art methods on three datasets.
 arXiv (2025-03-05)
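A brief note on why dropping negative sampling is possible here: if tail prediction is cast as direct classification over the entity vocabulary, every non-gold entity acts as an implicit negative in the softmax. The sketch below shows that general pattern only; the classifier head and toy setup are assumptions, not MuCo-KGC's actual architecture.

```python
import torch
import torch.nn.functional as F

def tail_prediction_loss(context_encoding, entity_classifier, gold_tail_ids):
    """Generic pattern (not MuCo-KGC's exact method): score every entity at
    once and train with cross-entropy, so no explicit negative triples are
    sampled -- all non-gold entities are implicit negatives in the softmax."""
    logits = entity_classifier(context_encoding)   # (B, num_entities)
    return F.cross_entropy(logits, gold_tail_ids)

# Toy usage: a linear classifier over a 5-entity vocabulary.
clf = torch.nn.Linear(8, 5)
ctx = torch.randn(2, 8)                            # encoded (head, relation, context)
print(tail_prediction_loss(ctx, clf, torch.tensor([3, 0])))
```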
- Diffusion-based Hierarchical Negative Sampling for Multimodal Knowledge Graph Completion [6.24078177211832]
 Multimodal Knowledge Graph Completion (MMKGC) aims to address the critical issue of missing knowledge in multimodal knowledge graphs.
Previous approaches ignore the employment of multimodal information to generate diverse and high-quality negative triples.
We propose a novel Diffusion-based Hierarchical Negative Sampling scheme tailored for MMKGC tasks.
 arXiv (2025-01-26)
- Distill-SynthKG: Distilling Knowledge Graph Synthesis Workflow for Improved Coverage and Efficiency [59.6772484292295]
 Knowledge graphs (KGs) generated by large language models (LLMs) are increasingly valuable for Retrieval-Augmented Generation (RAG) applications.
Existing KG extraction methods rely on prompt-based approaches, which are inefficient for processing large-scale corpora.
We propose SynthKG, a multi-step, document-level synthesis KG workflow based on LLMs.
We also design a novel graph-based retrieval framework for RAG.
 arXiv (2024-10-22)
- Exploiting Large Language Models Capabilities for Question Answer-Driven Knowledge Graph Completion Across Static and Temporal Domains [8.472388165833292]
 This paper introduces a new generative completion framework called Generative Subgraph-based KGC (GS-KGC).
GS-KGC employs a question-answering format to directly generate target entities, addressing the challenge that a question may have multiple possible answers.
Our method generates negative samples using known facts to facilitate the discovery of new information (a generic sketch of this filtering idea follows this entry).
 arXiv (2024-08-20)
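One common realization of "negatives from known facts" (shown below as an assumption, not GS-KGC's exact procedure) is to use the set of known answers for a query (h, r, ?) as a filter, so every sampled negative is guaranteed false with respect to the known facts.

```python
import random

def negatives_from_known_facts(known_answers, entities, query, k=8, seed=0):
    """Hypothetical sketch: for a query (head, relation, ?), sample entities
    outside the known answer set as negatives. `known_answers` maps
    (head, relation) -> set of gold tails."""
    rng = random.Random(seed)
    gold = known_answers.get(query, set())
    pool = [e for e in entities if e not in gold]
    return rng.sample(pool, min(k, len(pool)))

# Toy usage: "france" is a known answer, so it can never be drawn as a negative.
known = {("paris", "capital_of"): {"france"}}
entities = ["france", "germany", "berlin", "seine", "louvre"]
print(negatives_from_known_facts(known, entities, ("paris", "capital_of"), k=2))
```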
- Subgraph-Aware Training of Language Models for Knowledge Graph Completion Using Structure-Aware Contrastive Learning [4.741342276627672]
 Fine-tuning pre-trained language models (PLMs) has recently shown potential to improve knowledge graph completion (KGC).
We propose a Subgraph-Aware Training framework for KGC (SATKGC) with two ideas: (i) subgraph-aware mini-batching to encourage hard negative sampling and to mitigate the imbalance in entity occurrence frequencies during training, and (ii) a new contrastive learning scheme that focuses more on harder in-batch negative triples and harder positive triples in terms of the structural properties of the knowledge graph (an illustrative sketch follows this entry).
 arXiv (2024-07-17)
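To picture the second idea, here is an InfoNCE-style loss that up-weights the in-batch negatives scoring closest to the query; the specific weighting scheme is an illustrative assumption, not SATKGC's published objective.

```python
import torch
import torch.nn.functional as F

def hardness_weighted_contrastive(query_emb, entity_emb, temperature=0.05):
    """Illustrative sketch: diagonal pairs are positives, the rest of the
    batch are negatives; negatives with higher similarity (harder) receive
    relatively larger weight in the softmax denominator."""
    sim = query_emb @ entity_emb.t() / temperature        # (B, B) similarity matrix
    diag = torch.eye(sim.size(0), dtype=torch.bool)
    # Hardness weights: softmax over each row's negatives (positives excluded).
    w = torch.softmax(sim.masked_fill(diag, float("-inf")), dim=1)
    w = w.masked_fill(diag, 1.0).detach()                 # positives keep weight 1
    logits = sim + w.log()                                # harder negatives stand out
    return F.cross_entropy(logits, torch.arange(sim.size(0)))

# Toy usage with random unit-norm embeddings.
q = F.normalize(torch.randn(4, 16), dim=1)
e = F.normalize(torch.randn(4, 16), dim=1)
print(hardness_weighted_contrastive(q, e))
```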
- MyGO: Discrete Modality Information as Fine-Grained Tokens for Multi-modal Knowledge Graph Completion [51.80447197290866]
 We introduce MyGO to process, fuse, and augment the fine-grained modality information from MMKGs.
MyGO tokenizes multi-modal raw data as fine-grained discrete tokens and learns entity representations with a cross-modal entity encoder.
Experiments on standard MMKGC benchmarks reveal that our method surpasses 20 of the latest models.
 arXiv (2024-04-15)
- Contextualization Distillation from Large Language Model for Knowledge
  Graph Completion [51.126166442122546]
 We introduce the Contextualization Distillation strategy, a plug-in-and-play approach compatible with both discriminative and generative KGC frameworks.
Our method begins by instructing large language models to transform compact, structural triplets into context-rich segments.
 Comprehensive evaluations across diverse datasets and KGC techniques highlight the efficacy and adaptability of our approach.
 arXiv (2024-01-28)
- Few-Shot Inductive Learning on Temporal Knowledge Graphs using
  Concept-Aware Information [31.10140298420744]
 We propose a few-shot out-of-graph (OOG) link prediction task for temporal knowledge graphs (TKGs).
We predict the missing entities in links involving unseen entities by employing a meta-learning framework.
Our model achieves superior performance on all three datasets.
 arXiv (2022-11-15)
- Unified Multi-View Orthonormal Non-Negative Graph Based Clustering
  Framework [74.25493157757943]
 We formulate a novel clustering model, which exploits the non-negative feature property and incorporates the multi-view information into a unified joint learning framework.
We also explore, for the first time, the multi-model non-negative graph-based approach to clustering data based on deep features.
 arXiv (2022-11-03)
- KGxBoard: Explainable and Interactive Leaderboard for Evaluation of
  Knowledge Graph Completion Models [76.01814380927507]
 KGxBoard is an interactive framework for performing fine-grained evaluation on meaningful subsets of the data.
In our experiments, we highlight findings made with KGxBoard that would have been impossible to detect with standard averaged single-score metrics.
 arXiv (2022-08-23)
- GreenKGC: A Lightweight Knowledge Graph Completion Method [32.528770408502396]
 GreenKGC aims to discover missing relationships between entities in knowledge graphs.
It consists of three modules: representation learning, feature pruning, and decision learning.
In low dimensions, GreenKGC can outperform SOTA methods in most datasets.
 arXiv (2022-08-19)
- KGBoost: A Classification-based Knowledge Base Completion Method with
  Negative Sampling [29.14178162494542]
 KGBoost is a new method to train a powerful classifier for missing link prediction.
We conduct experiments on multiple benchmark datasets, and demonstrate that KGBoost outperforms state-of-the-art methods across most datasets.
Compared with models trained by end-to-end optimization, KGBoost works well in the low-dimensional setting, allowing a smaller model size.
 arXiv (2021-12-17)
- Cross-Modal Collaborative Representation Learning and a Large-Scale RGBT
  Benchmark for Crowd Counting [109.32927895352685]
 We introduce a large-scale RGBT Crowd Counting (RGBT-CC) benchmark, which contains 2,030 pairs of RGB-thermal images with 138,389 annotated people.
To facilitate the multimodal crowd counting, we propose a cross-modal collaborative representation learning framework.
Experiments conducted on the RGBT-CC benchmark demonstrate the effectiveness of our framework for RGBT crowd counting.
 arXiv (2020-12-08)
- KACC: A Multi-task Benchmark for Knowledge Abstraction, Concretization
  and Completion [99.47414073164656]
 A comprehensive knowledge graph (KG) contains an instance-level entity graph and an ontology-level concept graph.
The two-view KG provides a testbed for models to "simulate" human abilities of knowledge abstraction, concretization, and completion.
We propose a unified KG benchmark by improving existing benchmarks in terms of dataset scale, task coverage, and difficulty.
 arXiv (2020-04-28)
This list is automatically generated from the titles and abstracts of the papers on this site.
       
     
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.