Quantum negative sampling strategy for knowledge graph embedding with variational circuit
- URL: http://arxiv.org/abs/2502.17973v1
- Date: Tue, 25 Feb 2025 08:46:27 GMT
- Title: Quantum negative sampling strategy for knowledge graph embedding with variational circuit
- Authors: Pulak Ranjan Giri, Mori Kurokawa, Kazuhiro Saito
- Abstract summary: A hybrid quantum-classical model for knowledge graph embedding has been studied in which a variational quantum circuit is trained. In this article we study a negative sampling strategy that exploits quantum superposition and evaluate the model's performance on a knowledge graph database.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A knowledge graph is a collection of facts, known as triples (head, relation, tail), which are represented in the form of a network, where nodes are entities and edges are relations between the respective head and tail entities. Embedding of knowledge graphs to facilitate downstream tasks such as knowledge graph completion, link prediction, and recommendation has recently been a major area of research in classical machine learning. Because knowledge graphs are becoming larger, a natural choice is to exploit quantum computing for knowledge graph embedding. Recently, a hybrid quantum-classical model for knowledge graph embedding has been studied in which a variational quantum circuit is trained. An important aspect of knowledge graph embedding is the sampling of negative triples, which plays a crucial role in the efficient training of the model. In classical machine learning, various negative sampling strategies have been studied. In a quantum knowledge graph embedding model, although these strategies can be used in principle, it is natural to ask whether a quantum advantage can be exploited in negative sampling. In this article we study such a negative sampling strategy, which exploits quantum superposition, and evaluate the model's performance on a knowledge graph database.
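To make the pipeline described in the abstract concrete, here is a minimal sketch, simulated in plain NumPy, of a toy two-qubit variational scorer for triples paired with a classical uniform negative-sampling baseline. The angle encoding, the two-qubit ansatz, the toy data, and the helper names `triple_score` and `sample_negative` are illustrative assumptions, not the circuit or the quantum-superposition sampling scheme studied in the paper.

```python
# Illustrative sketch only: a toy hybrid quantum-classical scorer for triples
# (head, relation, tail) plus a uniform negative-sampling step. The encoding,
# ansatz, and helper names are assumptions for illustration, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def triple_score(h_angle, r_angle, t_angle, theta):
    """Score a triple as the probability of measuring |00> after a tiny
    variational circuit whose rotations are set by the embeddings."""
    state = np.zeros(4); state[0] = 1.0                             # start in |00>
    state = np.kron(ry(h_angle), ry(t_angle)) @ state               # encode head / tail
    state = CNOT @ state                                            # entangle the two qubits
    state = np.kron(ry(r_angle + theta[0]), ry(theta[1])) @ state   # relation + trainable layer
    return float(np.abs(state[0]) ** 2)

# Toy knowledge graph: entity and relation embeddings are single angles.
n_entities = 8
ent = rng.uniform(0, np.pi, n_entities)    # entity embeddings (angles)
rel = rng.uniform(0, np.pi, 2)             # relation embeddings (angles)
theta = rng.uniform(0, np.pi, 2)           # trainable circuit parameters
positives = [(0, 0, 1), (2, 1, 3)]         # known (head, relation, tail) triples

def sample_negative(h, r, t):
    """Classical baseline: corrupt the tail uniformly at random, avoiding the
    true tail. A quantum strategy could instead hold the candidate tails in
    superposition and obtain the sample from a measurement."""
    candidates = [e for e in range(n_entities) if e != t]
    return (h, r, int(rng.choice(candidates)))

# Margin-style comparison of each positive triple with its sampled negative.
for (h, r, t) in positives:
    hn, rn, tn = sample_negative(h, r, t)
    pos = triple_score(ent[h], rel[r], ent[t], theta)
    neg = triple_score(ent[hn], rel[rn], ent[tn], theta)
    print(f"pos ({h},{r},{t}): {pos:.3f}   neg ({hn},{rn},{tn}): {neg:.3f}")
```

The uniform classical draw in `sample_negative` only marks the place where the paper's quantum step would go: there, the candidate corruptions are effectively prepared in superposition and a measurement yields the negative sample.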
Related papers
- Knowledge Graph Enhanced Generative Multi-modal Models for Class-Incremental Learning [51.0864247376786]
We introduce a Knowledge Graph Enhanced Generative Multi-modal model (KG-GMM) that builds an evolving knowledge graph throughout the learning process.
During testing, we propose a Knowledge Graph Augmented Inference method that locates specific categories by analyzing relationships within the generated text.
arXiv Detail & Related papers (2025-03-24T07:20:43Z) - QGHNN: A quantum graph Hamiltonian neural network [30.632260870411177]
Graph Neural Networks (GNNs) strive to address the challenges posed by complex, high-dimensional graph data.
Quantum Neural Networks (QNNs) present a compelling alternative due to their potential for quantum parallelism.
This paper introduces a quantum graph Hamiltonian neural network (QGHNN) to enhance graph representation and learning on noisy intermediate-scale quantum computers.
arXiv Detail & Related papers (2025-01-14T10:15:17Z) - A Theory of Link Prediction via Relational Weisfeiler-Leman on Knowledge Graphs [6.379544211152605]
Graph neural networks are prominent models for representation learning over graph-structured data.
Our goal is to provide a systematic understanding of the landscape of graph neural networks for knowledge graphs.
arXiv Detail & Related papers (2023-02-04T17:40:03Z) - Quantum Graph Learning: Frontiers and Outlook [14.1772249363715]
Using quantum theory to enhance graph learning is still in its infancy.
We first look at QGL and discuss the mutualism of quantum theory and graph learning.
A new taxonomy of QGL is presented, i.e., quantum computing on graphs, quantum graph representation, and quantum circuits for graph neural networks.
arXiv Detail & Related papers (2023-02-02T05:53:31Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the qiskit quantum computing SDK (a minimal re-uploading sketch appears after this list).
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - From Quantum Graph Computing to Quantum Graph Learning: A Survey [86.8206129053725]
We first elaborate the correlations between quantum mechanics and graph theory to show that quantum computers are able to generate useful solutions.
For its practicability and wide-applicability, we give a brief review of typical graph learning techniques.
We give a snapshot of quantum graph learning where expectations serve as a catalyst for subsequent research.
arXiv Detail & Related papers (2022-02-19T02:56:47Z) - Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z) - A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for Question Answering Over Dynamic Contexts [81.4757750425247]
We study question answering over a dynamic textual environment.
We develop a graph neural network over the constructed graph, and train the model in an end-to-end manner.
arXiv Detail & Related papers (2020-04-25T04:53:54Z) - Quantum Machine Learning Algorithm for Knowledge Graphs [35.149125599812706]
Implicit knowledge can be inferred by modeling and reconstructing the tensor representations generated from knowledge graphs.
This paper investigates how quantum resources can be capitalized to accelerate the modeling of knowledge graphs.
arXiv Detail & Related papers (2020-01-04T13:26:29Z)
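As a side note on the single-qubit data re-uploading technique mentioned in the didactic entry above, the following is a minimal sketch written in plain NumPy rather than qiskit, to keep it self-contained. The layer structure, parameter layout, and toy task are assumptions for illustration and do not reproduce the formulations of that paper.

```python
# Minimal single-qubit data re-uploading sketch in plain NumPy. The layer
# structure, parameter names, and toy task are illustrative assumptions.
import numpy as np

def rot(a, b, c):
    """General single-qubit rotation Rz(c) Ry(b) Rz(a) as a 2x2 unitary."""
    rz = lambda t: np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])
    ry = lambda t: np.array([[np.cos(t / 2), -np.sin(t / 2)],
                             [np.sin(t / 2),  np.cos(t / 2)]])
    return rz(c) @ ry(b) @ rz(a)

def model(x, params):
    """Re-upload the scalar input x once per layer, interleaved with
    trainable angles; return P(|0>) as the class score."""
    state = np.array([1.0 + 0j, 0.0])                 # start in |0>
    for (w, a, b, c) in params:                       # one tuple per layer
        state = rot(a + w * x, b, c) @ state          # data re-uploading step
    return float(np.abs(state[0]) ** 2)

# Toy 1-D input grid with random (untrained) parameters for three layers.
rng = np.random.default_rng(1)
params = [tuple(rng.uniform(-np.pi, np.pi, 4)) for _ in range(3)]
for x in np.linspace(-1, 1, 5):
    print(f"x = {x:+.2f}  ->  P(|0>) = {model(x, params):.3f}")
```

Training such a model would be done classically, for example by gradient-free optimization or parameter-shift gradients over the layer angles.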