Global Optimization on Graph-Structured Data via Gaussian Processes with Spectral Representations
- URL: http://arxiv.org/abs/2511.07734v1
- Date: Wed, 12 Nov 2025 01:13:56 GMT
- Title: Global Optimization on Graph-Structured Data via Gaussian Processes with Spectral Representations
- Authors: Shu Hong, Yongsheng Mei, Mahdi Imani, Tian Lan
- Abstract summary: We introduce a scalable framework for global optimization over graphs. We infer graph structure and node representations through learnable embeddings. Experiments on synthetic and real-world datasets demonstrate that our approach achieves faster convergence and improved optimization performance compared to prior methods.
- Score: 13.634723898878983
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian optimization (BO) is a powerful framework for optimizing expensive black-box objectives, yet extending it to graph-structured domains remains challenging due to the discrete and combinatorial nature of graphs. Existing approaches often rely on either full graph topology (impractical for large or partially observed graphs) or incremental exploration, which can lead to slow convergence. We introduce a scalable framework for global optimization over graphs that employs low-rank spectral representations to build Gaussian process (GP) surrogates from sparse structural observations. The method jointly infers graph structure and node representations through learnable embeddings, enabling efficient global search and principled uncertainty estimation even with limited data. We also provide theoretical analysis establishing conditions for accurate recovery of the underlying graph structure under different sampling regimes. Experiments on synthetic and real-world datasets demonstrate that our approach achieves faster convergence and improved optimization performance compared to prior methods.
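As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below builds low-rank spectral node embeddings from a graph Laplacian, fits a plain GP surrogate on a few observed nodes, and selects the next query with an upper-confidence-bound rule. The toy path graph, objective values, kernel, and all function names are assumptions chosen for illustration.

```python
import numpy as np

def low_rank_spectral_features(adj, k):
    """Top-k Laplacian eigenvectors as low-rank node embeddings (illustrative)."""
    deg = adj.sum(axis=1)
    lap = np.diag(deg) - adj
    vals, vecs = np.linalg.eigh(lap)      # eigenvalues in ascending order
    return vecs[:, :k]                    # smallest-eigenvalue eigenvectors

def gp_posterior(X_train, y_train, X_test, length=1.0, noise=1e-4):
    """Standard GP regression with an RBF kernel on the node embeddings."""
    def rbf(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf(X_test, X_train)
    Kss = rbf(X_test, X_test)
    Kinv = np.linalg.inv(K)
    mu = Ks @ Kinv @ y_train
    cov = Kss - Ks @ Kinv @ Ks.T
    return mu, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Toy 5-node path graph; the black-box objective peaks at the middle node.
n = 5
adj = np.zeros((n, n))
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
f = np.array([0.0, 1.0, 3.0, 1.0, 0.0])

Z = low_rank_spectral_features(adj, k=3)
observed = [0, 4]                          # sparse structural observations
mu, sd = gp_posterior(Z[observed], f[observed], Z)
ucb = mu + 2.0 * sd                        # upper-confidence-bound acquisition
next_node = int(np.argmax(ucb))            # query this node next
```

The acquisition naturally favors unobserved nodes, whose posterior uncertainty is high, which is the behavior a BO loop over graph nodes relies on.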
Related papers
- GADPN: Graph Adaptive Denoising and Perturbation Networks via Singular Value Decomposition [6.24191713518868]
GADPN is a graph structure learning framework that adaptively refines graph topology via low-rank denoising and generalized structural perturbation. It achieves state-of-the-art performance while significantly improving efficiency. It shows particularly strong gains on challenging disassortative graphs, validating its ability to robustly learn enhanced graph structures.
arXiv Detail & Related papers (2026-01-13T05:25:32Z) - Graph Structure Learning with Temporal Graph Information Bottleneck for Inductive Representation Learning [1.3728301825671199]
We propose a versatile framework that integrates Graph Structure Learning (GSL) with the Temporal Graph Information Bottleneck (TGIB). We design a novel two-step GSL-based structural enhancer to enrich and optimize node neighborhoods. The TGIB refines the optimized graph by extending the information bottleneck principle to temporal graphs, regularizing both edges and features.
arXiv Detail & Related papers (2025-08-20T17:13:19Z) - Global optimization of graph acquisition functions for neural architecture search [6.266977090949175]
Graph Bayesian optimization has shown potential as a powerful and data-efficient tool for neural architecture search (NAS). This paper presents explicit optimization formulations for the graph input space, including properties such as reachability and shortest paths. We theoretically prove that the proposed encoding is an equivalent representation of the graph space and provide restrictions for the NAS domain with either node or edge labels.
arXiv Detail & Related papers (2025-05-29T16:46:29Z) - Spectral Clustering for Directed Graphs via Likelihood Estimation on Stochastic Block Models [22.421702511126373]
We leverage statistical inference on block models to guide the development of a spectral clustering algorithm for directed graphs. We establish a theoretical upper bound on the misclustering error of its spectral relaxation and, based on this relaxation, introduce a novel, self-adaptive spectral clustering method for directed graphs.
arXiv Detail & Related papers (2024-03-28T15:47:13Z) - GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z) - Graph Condensation via Receptive Field Distribution Matching [61.71711656856704]
This paper focuses on creating a small graph to represent the original graph, so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to synthesize a small graph whose receptive fields share a similar distribution.
arXiv Detail & Related papers (2022-06-28T02:10:05Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z) - A Fast and Robust Method for Global Topological Functional Optimization [70.11080854486953]
We introduce a novel backpropagation scheme that is significantly faster, more stable, and produces more robust optima.
This scheme can also be used to produce a stable visualization of dots in a persistence diagram as a distribution over critical and near-critical simplices in the data structure.
arXiv Detail & Related papers (2020-09-17T18:46:16Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that yields dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
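The dyadic-independence (edge-independent) family mentioned in the last entry above can be illustrated with a minimal sketch. The Chung-Lu model below is a standard member of that family, not the paper's own approximation scheme, and the node weights are an arbitrary toy choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def chung_lu_sample(weights):
    """Sample one graph from an edge-independent (dyadic) distribution.

    Each edge (i, j) appears independently with probability
    min(w_i * w_j / sum(w), 1) -- the Chung-Lu model, a simple
    dyadic-independence distribution over graphs.
    """
    w = np.asarray(weights, dtype=float)
    p = np.minimum(np.outer(w, w) / w.sum(), 1.0)
    coins = rng.random(p.shape) < p          # independent edge coin flips
    adj = np.triu(coins, k=1)                # one direction only, no self-loops
    return (adj | adj.T).astype(int)         # symmetrize into an undirected graph

A = chung_lu_sample([3, 2, 2, 1, 1])
```

Because every edge is an independent Bernoulli draw, the whole distribution factorizes over dyads, which is exactly what makes such approximations scalable to graphs with millions of nodes.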
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.