Quantum Machine Learning Algorithm for Knowledge Graphs
- URL: http://arxiv.org/abs/2001.01077v2
- Date: Tue, 24 Nov 2020 19:52:43 GMT
- Title: Quantum Machine Learning Algorithm for Knowledge Graphs
- Authors: Yunpu Ma and Volker Tresp
- Abstract summary: Implicit knowledge can be inferred by modeling and reconstructing the tensor representations generated from knowledge graphs.
This paper investigates how quantum resources can be capitalized on to accelerate the modeling of knowledge graphs.
- Score: 35.149125599812706
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Semantic knowledge graphs are large-scale triple-oriented databases for
knowledge representation and reasoning. Implicit knowledge can be inferred by
modeling and reconstructing the tensor representations generated from knowledge
graphs. However, as the sizes of knowledge graphs continue to grow, classical
modeling becomes increasingly computational resource intensive. This paper
investigates how quantum resources can be capitalized on to accelerate the
modeling of knowledge graphs. In particular, we propose the first quantum
machine learning algorithm for making inference on tensorized data, e.g., on
knowledge graphs. Since most tensor problems are NP-hard, it is challenging to
devise quantum algorithms to support that task. We simplify the problem by
making a plausible assumption that the tensor representation of a knowledge
graph can be approximated by its low-rank tensor singular value decomposition,
which is verified by our experiments. The proposed sampling-based quantum
algorithm achieves exponential speedup with a runtime that is polylogarithmic
in the dimension of the knowledge graph tensor.
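The low-rank assumption above can be illustrated with a minimal classical sketch (not the paper's quantum algorithm): a knowledge graph is stored as a binary tensor, unfolded into a matrix, and approximated by a truncated SVD; reconstructed scores for unobserved triples then serve as plausibility estimates. The entity/relation sizes and triples below are hypothetical toy data.

```python
import numpy as np

# Toy knowledge graph with E entities and R relations stored as a binary
# tensor T, where T[s, r, o] = 1 iff the triple (subject s, relation r,
# object o) is known. All data here is hypothetical.
E, R = 4, 2
T = np.zeros((E, R, E))
known_triples = [(0, 0, 1), (1, 0, 2), (2, 0, 3), (0, 1, 3)]
for s, r, o in known_triples:
    T[s, r, o] = 1.0

# Low-rank reconstruction via SVD of the subject-mode unfolding. The paper's
# working assumption is that such a truncated decomposition approximates the
# knowledge graph tensor well.
rank = 2
M = T.reshape(E, R * E)                       # unfold along the subject mode
U, sigma, Vt = np.linalg.svd(M, full_matrices=False)
M_approx = U[:, :rank] @ np.diag(sigma[:rank]) @ Vt[:rank, :]
T_approx = M_approx.reshape(E, R, E)

# Entries of T_approx at unobserved positions act as plausibility scores
# for inferring implicit triples.
score = T_approx[1, 1, 3]                     # score of the unseen triple (1, 1, 3)
```

The quantum contribution of the paper is to perform the analogous sampling-based inference with runtime polylogarithmic in the tensor dimension, which this dense classical sketch does not attempt.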
Related papers
- The Tensor as an Informational Resource [1.3044677039636754]
A tensor is a multidimensional array of numbers that can be used to store data, encode a computational relation and represent quantum entanglement.
We propose a family of information-theoretically constructed preorders on tensors, which can be used to compare tensors with each other and to assess the existence of transformations between them.
arXiv Detail & Related papers (2023-11-03T18:47:39Z)
- Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm.
arXiv Detail & Related papers (2023-10-04T02:55:10Z)
- On quantum backpropagation, information reuse, and cheating measurement collapse [6.476797054353113]
We show that parameterized quantum models can train as efficiently as classical neural networks.
We introduce an algorithm with foundations in shadow tomography that matches backpropagation scaling in quantum resources.
arXiv Detail & Related papers (2023-05-22T18:00:02Z)
- Scaling R-GCN Training with Graph Summarization [71.06855946732296]
Training of Relational Graph Convolutional Networks (R-GCN) does not scale well with the size of the graph.
In this work, we experiment with the use of graph summarization techniques to compress the graph.
We obtain reasonable results on the AIFB, MUTAG and AM datasets.
arXiv Detail & Related papers (2022-03-05T00:28:43Z)
- From Quantum Graph Computing to Quantum Graph Learning: A Survey [86.8206129053725]
We first elaborate on the correlations between quantum mechanics and graph theory to show that quantum computers are able to generate useful solutions.
Given their practicability and wide applicability, we give a brief review of typical graph learning techniques.
We give a snapshot of quantum graph learning, where expectations serve as a catalyst for subsequent research.
arXiv Detail & Related papers (2022-02-19T02:56:47Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Graph kernels encoding features of all subgraphs by quantum superposition [4.031373438192993]
We propose a novel graph kernel that applies a quantum computer to measure the graph similarity taking all subgraphs into account.
For the construction of the quantum kernel, we develop an efficient protocol that removes the index information of subgraphs encoded in the quantum state.
A detailed numerical simulation of a bioinformatics problem is presented to demonstrate that the proposed quantum kernel achieves better classification accuracy than existing graph kernels.
arXiv Detail & Related papers (2021-03-30T05:50:23Z)
- Quantum machine learning of graph-structured data [0.38581147665516596]
We consider graph-structured quantum data and describe how to carry out its quantum machine learning via quantum neural networks.
We explain how to systematically exploit this additional graph structure to improve quantum learning algorithms.
arXiv Detail & Related papers (2021-03-19T14:39:19Z)
- A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for Question Answering Over Dynamic Contexts [81.4757750425247]
We study question answering over a dynamic textual environment.
We develop a graph neural network over the constructed graph, and train the model in an end-to-end manner.
arXiv Detail & Related papers (2020-04-25T04:53:54Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.