Quantum Simplicial Neural Networks
- URL: http://arxiv.org/abs/2501.05558v1
- Date: Thu, 09 Jan 2025 20:07:25 GMT
- Title: Quantum Simplicial Neural Networks
- Authors: Simone Piperno, Claudio Battiloro, Andrea Ceschini, Francesca Dominici, Paolo Di Lorenzo, Massimo Panella
- Abstract summary: We present the first Quantum Topological Deep Learning Model: Quantum Simplicial Networks (QSNs). QSNs are a stack of Quantum Simplicial Layers, which are inspired by the Ising model to encode higher-order structures into quantum states. Experiments on synthetic classification tasks show that QSNs can outperform classical simplicial TDL models in accuracy and efficiency.
- Score: 11.758402121933996
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs) excel at learning from graph-structured data but are limited to modeling pairwise interactions, insufficient for capturing higher-order relationships present in many real-world systems. Topological Deep Learning (TDL) has allowed for systematic modeling of hierarchical higher-order interactions by relying on combinatorial topological spaces such as simplicial complexes. In parallel, Quantum Neural Networks (QNNs) have been introduced to leverage quantum mechanics for enhanced computational and learning power. In this work, we present the first Quantum Topological Deep Learning Model: Quantum Simplicial Networks (QSNs), being QNNs operating on simplicial complexes. QSNs are a stack of Quantum Simplicial Layers, which are inspired by the Ising model to encode higher-order structures into quantum states. Experiments on synthetic classification tasks show that QSNs can outperform classical simplicial TDL models in accuracy and efficiency, demonstrating the potential of combining quantum computing with TDL for processing data on combinatorial topological spaces.
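The abstract builds on simplicial complexes as the domain on which stacked layers operate. As a purely classical, hypothetical sketch (not the paper's quantum circuit), the following shows how a small simplicial complex can be encoded with signed incidence matrices and how one simplicial layer could propagate an edge signal through the Hodge Laplacian; the toy complex, weights, and layer form are illustrative assumptions.

```python
import numpy as np

# Toy complex: 3 nodes, 3 edges, 1 filled triangle (all orientations fixed).
# B1 is the node-edge incidence matrix, B2 the edge-triangle incidence matrix.
B1 = np.array([
    [-1, -1,  0],
    [ 1,  0, -1],
    [ 0,  1,  1],
])  # nodes x edges
B2 = np.array([
    [ 1],
    [-1],
    [ 1],
])  # edges x triangles

# Hodge Laplacian on edges: L1 = B1^T B1 + B2 B2^T
L1 = B1.T @ B1 + B2 @ B2.T

def simplicial_layer(x, L, w_self=0.5, w_neigh=0.1):
    """One classical simplicial 'convolution' step: mix the edge signal
    with its Laplacian-diffused version (illustrative weights)."""
    return w_self * x + w_neigh * (L @ x)

x_edges = np.array([1.0, 0.0, -1.0])  # an edge signal (1-cochain)
y = simplicial_layer(x_edges, L1)
```

For this single filled triangle the edge Laplacian reduces to 3I, so the layer simply rescales the edge signal; on larger complexes it mixes information across edges sharing a node or a triangle, which is the higher-order structure the quantum layers are meant to encode.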
Related papers
- LCQNN: Linear Combination of Quantum Neural Networks [7.010027035873597]
We introduce the Linear Combination of Quantum Neural Networks (LCQNN) framework, which uses the linear combination of unitaries concept to create a tunable design. We show how specific structural choices, such as adopting $k$ control unitaries or restricting the model to certain group-theoretic subspaces, prevent gradients from collapsing. In group action scenarios, we show that by exploiting symmetry and excluding exponentially large irreducible subspaces, the model circumvents barren plateaus.
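The linear-combination-of-unitaries (LCU) concept that LCQNN builds on can be illustrated on small state vectors. This is a hypothetical classical sketch: the Pauli operators, coefficients, and the `lcu_apply` helper are illustrative, and a real LCQNN would realize the combination with ancilla-based circuits rather than dense matrix algebra.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli-X
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-Z

def lcu_apply(state, unitaries, alphas):
    """Apply A = sum_k alpha_k U_k to `state`.
    A is generally not unitary even though each U_k is."""
    return sum(a * (U @ state) for a, U in zip(alphas, unitaries))

psi = np.array([1.0, 0.0], dtype=complex)       # |0>
out = lcu_apply(psi, [X, Z], [0.5, 0.5])        # (X + Z)/2 applied to |0>
```

Tuning which unitaries enter the sum (e.g. only $k$ of them) is the kind of structural knob the abstract says controls trainability.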
arXiv Detail & Related papers (2025-07-03T17:43:10Z)
- Quantum Recurrent Embedding Neural Network [11.54075064463256]
We propose a quantum recurrent embedding neural network (QRENN) inspired by fast-track information pathways in ResNet. We provide a rigorous proof of the trainability of QRENN circuits, demonstrating that this deep quantum neural network can avoid barren plateaus. Our results highlight the power of recurrent data embedding in quantum neural networks and the potential for scalable quantum supervised learning.
arXiv Detail & Related papers (2025-06-16T07:50:31Z)
- Inductive Graph Representation Learning with Quantum Graph Neural Networks [0.40964539027092917]
Quantum Graph Neural Networks (QGNNs) present a promising approach for combining quantum computing with graph-structured data processing.
We propose a versatile QGNN framework inspired by the classical GraphSAGE approach, utilizing quantum models as aggregators.
We show that our quantum approach exhibits robust generalization across molecules with varying numbers of atoms without requiring circuit modifications.
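A classical sketch of the GraphSAGE-style aggregation that this QGNN framework replaces with quantum models may help fix ideas; the mean aggregator, weight shape, and `sage_layer` helper below are illustrative assumptions, not the paper's circuit.

```python
import numpy as np

def sage_layer(h, neighbors, W):
    """One GraphSAGE-style step: concat(self, mean of neighbors) @ W.
    In the QGNN the aggregator is a quantum model; this is a plain
    classical stand-in for illustration."""
    n, d = h.shape
    out = np.zeros((n, W.shape[1]))
    for v in range(n):
        nbrs = neighbors[v]
        agg = h[nbrs].mean(axis=0) if nbrs else np.zeros(d)
        out[v] = np.concatenate([h[v], agg]) @ W
    return out

h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # 3 nodes, dim 2
neighbors = {0: [1, 2], 1: [0], 2: [0, 1]}           # a small triangle graph
W = np.vstack([np.eye(2), np.eye(2)])                # adds self + neighbor mean
out = sage_layer(h, neighbors, W)
```

Because the layer only consumes a node's local neighborhood, it transfers to graphs of different sizes, which is the inductive property the summary highlights for molecules with varying atom counts.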
arXiv Detail & Related papers (2025-03-31T14:04:08Z)
- Let the Quantum Creep In: Designing Quantum Neural Network Models by Gradually Swapping Out Classical Components [1.024113475677323]
Modern AI systems are often built on neural networks.
We propose a framework where classical neural network layers are gradually replaced by quantum layers.
We conduct numerical experiments on image classification datasets to demonstrate the change of performance brought by the systematic introduction of quantum components.
arXiv Detail & Related papers (2024-09-26T07:01:29Z)
- CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
We develop the Liquid Quantum Neural Network (LQNet) and the Continuous Time Recurrent Quantum Neural Network (CTRQNet).
LQNet and CTRQNet achieve accuracy increases as high as 40% on CIFAR-10 binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z)
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z)
- Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale [54.15522908057831]
We propose an adapted version of the computationally efficient MLP-Mixer for STTD forecasting at scale.
Our results surprisingly show that this simple-yet-effective solution can rival SOTA baselines when tested on several traffic benchmarks.
Our findings contribute to the exploration of simple-yet-effective models for real-world STTD forecasting.
arXiv Detail & Related papers (2023-07-04T05:19:19Z)
- Variational Quantum Neural Networks (VQNNs) in Image Classification [0.0]
This paper investigates how the training of quantum neural networks (QNNs) can be done using quantum optimization algorithms.
A QNN structure is constructed in which a variational parameterized circuit is incorporated as an input layer, called the Variational Quantum Neural Network (VQNN).
VQNNs are evaluated on MNIST digit recognition (less complex) and crack image classification datasets, converging in less time than a plain QNN while achieving decent training accuracy.
arXiv Detail & Related papers (2023-03-10T11:24:32Z)
- QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn the local message passing among nodes with a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even better than, the classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z)
- Quantum Self-Attention Neural Networks for Text Classification [8.975913540662441]
We propose a new simple network architecture, called the quantum self-attention neural network (QSANN).
We introduce the self-attention mechanism into quantum neural networks and then utilize a Gaussian projected quantum self-attention serving as a sensible quantum version of self-attention.
Our method exhibits robustness to low-level quantum noises and showcases resilience to quantum neural network architectures.
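The Gaussian projected self-attention mentioned above can be caricatured classically by replacing the usual softmax(QK^T) attention weights with a Gaussian kernel of query-key distances. Everything below (function name, normalization, dimensions) is a hypothetical classical analogue, not the paper's quantum construction.

```python
import numpy as np

def gaussian_self_attention(X, Wq, Wk, Wv):
    """Classical analogue: attention weights from a Gaussian kernel
    exp(-||q_i - k_j||^2) instead of softmax(QK^T)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # pairwise squared distances between queries and keys
    d2 = ((Q[:, None, :] - K[None, :, :]) ** 2).sum(-1)
    A = np.exp(-d2)
    A = A / A.sum(axis=1, keepdims=True)  # row-normalize
    return A @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))               # 4 tokens, dim 3
W = [rng.normal(size=(3, 3)) for _ in range(3)]
out = gaussian_self_attention(X, *W)
```

With zero query/key projections the Gaussian weights become uniform and each token just averages the values, a useful sanity check on the normalization.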
arXiv Detail & Related papers (2022-05-11T16:50:46Z)
- Quantum-inspired Complex Convolutional Neural Networks [17.65730040410185]
We improve the quantum-inspired neurons by exploiting the complex-valued weights which have richer representational capacity and better non-linearity.
We present models of quantum-inspired convolutional neural networks (QICNNs) capable of processing high-dimensional data.
The classification accuracy of the five QICNNs is tested on the MNIST and CIFAR-10 datasets.
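A minimal sketch of a quantum-inspired neuron with complex-valued weights, where the output is the squared modulus of the activation (loosely mimicking a measurement probability); the exact neuron form used by the QICNNs is an assumption here, chosen only to show why complex weights add representational capacity.

```python
import numpy as np

def complex_neuron(x, w, b):
    """Hypothetical quantum-inspired neuron: complex weighted sum,
    then squared modulus as a built-in non-linearity."""
    z = np.dot(w, x) + b
    return np.abs(z) ** 2

x = np.array([1.0, 0.0])
w = np.array([1j, 1.0])         # complex-valued weights carry phase
y = complex_neuron(x, w, 0.0)   # |1j|^2 = 1.0
```

The phase of each weight lets two inputs interfere constructively or destructively before the modulus is taken, which a real-weighted neuron with the same parameter count cannot express.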
arXiv Detail & Related papers (2021-10-31T03:10:48Z)
- The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
- Branching Quantum Convolutional Neural Networks [0.0]
Small-scale quantum computers are already showing potential gains in learning tasks on large quantum and very large classical data sets.
We present a generalization of QCNN, the branching quantum convolutional neural network, or bQCNN, with substantially higher expressibility.
arXiv Detail & Related papers (2020-12-28T19:00:03Z)
- Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We study QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z)
- Entanglement Classification via Neural Network Quantum States [58.720142291102135]
In this paper we combine machine-learning tools and the theory of quantum entanglement to perform entanglement classification for multipartite qubit systems in pure states.
We use a parameterisation of quantum systems via artificial neural networks in a restricted Boltzmann machine (RBM) architecture, known as Neural-Network Quantum States (NQS).
arXiv Detail & Related papers (2019-12-31T07:40:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.