Optimization of Bottlenecks in Quantum Graphs Guided by Fiedler Vector-Based Spectral Derivatives
- URL: http://arxiv.org/abs/2506.07875v1
- Date: Mon, 09 Jun 2025 15:46:46 GMT
- Title: Optimization of Bottlenecks in Quantum Graphs Guided by Fiedler Vector-Based Spectral Derivatives
- Authors: John TM Campbell, John Dooley
- Abstract summary: We discuss the relationships between the Fiedler vector, the Cheeger constant, and threshold behaviors in networks of quantum resource nodes represented as Quantum Directed Acyclic Graphs (QDAGs). We explore how these mathematical constructs can be applied to understand the dynamics of quantum information flow in QDAGs, especially in the context of routing problems with bottlenecks in graph signal processing.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This paper discusses the relationships between the Fiedler vector, the Cheeger constant, and threshold behaviors in networks of quantum resource nodes represented as Quantum Directed Acyclic Graphs (QDAGs). We explore how these mathematical constructs can be applied to understand the dynamics of quantum information flow in QDAGs, especially in the context of routing problems with bottlenecks in graph signal processing, and how new eigenvalue-based rewiring techniques can optimize entanglement distribution between nodes in a QDAG.
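The bottleneck role of the Fiedler vector mentioned in the abstract can be illustrated on a small classical graph (a minimal sketch only; the toy graph below is illustrative and not the paper's QDAG construction). The Fiedler vector is the eigenvector of the combinatorial Laplacian L = D - A associated with the second-smallest eigenvalue (the algebraic connectivity); its sign pattern identifies the two sides of a sparse cut:

```python
import numpy as np

def fiedler(adj: np.ndarray):
    """Return (lambda_2, v_2): the algebraic connectivity and Fiedler
    vector of the combinatorial Laplacian L = D - A."""
    laplacian = np.diag(adj.sum(axis=1)) - adj
    eigvals, eigvecs = np.linalg.eigh(laplacian)  # eigenvalues ascending
    return eigvals[1], eigvecs[:, 1]

# Toy graph (not from the paper): two triangles {0,1,2} and {3,4,5}
# joined by a single bridge edge (2, 3) -- a deliberate bottleneck.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]:
    A[i, j] = A[j, i] = 1.0

lam2, v2 = fiedler(A)
# lam2 is small because the bridge is a bottleneck, and the sign of
# each entry of v2 labels which side of the cut its node falls on.
```

Rewiring heuristics of the kind the abstract describes monitor how lambda_2 responds to edge edits: adding an edge across the cut raises the algebraic connectivity, which (via Cheeger-type inequalities) loosens the bottleneck.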
Related papers
- A Spectral Interpretation of Redundancy in a Graph Reservoir [51.40366905583043]
This work revisits the definition of the reservoir in the Multiresolution Reservoir Graph Neural Network (MRGNN). It proposes a variant based on a Fairing algorithm originally introduced in the field of surface design in computer graphics. The core contribution of the paper lies in the theoretical analysis of the algorithm from a random-walks perspective.
arXiv Detail & Related papers (2025-07-17T10:02:57Z)
- Quantum Graph Convolutional Networks Based on Spectral Methods [10.250921033123152]
Graph Convolutional Networks (GCNs) are specialized neural networks for feature extraction from graph-structured data. This paper introduces an enhancement to GCNs based on spectral methods by integrating quantum computing techniques.
arXiv Detail & Related papers (2025-03-09T05:08:15Z)
- A quantum annealing approach to graph node embedding [1.0878040851638]
Node embedding is a key technique for representing graph nodes as vectors while preserving structural and relational properties. Classical methods such as DeepWalk, node2vec, and graph convolutional networks learn node embeddings by capturing structural and relational patterns in graphs. Quantum computing provides a promising alternative for graph-based learning by leveraging quantum effects and introducing novel optimization approaches.
arXiv Detail & Related papers (2025-03-08T20:11:55Z)
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs).
Our results establish the theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
- QDC: Quantum Diffusion Convolution Kernels on Graphs [0.0]
Graph convolutional neural networks (GCNs) operate by aggregating messages over local neighborhoods for a prediction task of interest.
We propose a new convolution kernel that effectively rewires the graph according to the occupation correlations of the vertices by trading on the generalized diffusion paradigm for the propagation of a quantum particle over the graph.
arXiv Detail & Related papers (2023-07-20T21:10:54Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes through a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even better than, the classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Scattering GCN: Overcoming Oversmoothness in Graph Convolutional Networks [0.0]
Graph convolutional networks (GCNs) have shown promising results in processing graph data by extracting structure-aware features.
Here, we propose to augment conventional GCNs with geometric scattering transforms and residual convolutions.
The former enables band-pass filtering of graph signals, thus alleviating the so-called oversmoothing often encountered in GCNs.
arXiv Detail & Related papers (2020-03-18T18:03:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.