Spectral Contraction of Boundary-Weighted Filters on delta-Hyperbolic Graphs
- URL: http://arxiv.org/abs/2506.15464v1
- Date: Wed, 18 Jun 2025 13:55:30 GMT
- Title: Spectral Contraction of Boundary-Weighted Filters on delta-Hyperbolic Graphs
- Authors: Le Vu Anh, Mehmet Dik, Nguyen Viet Anh
- Abstract summary: Hierarchical graphs often exhibit tree-like branching patterns, a structural property that challenges the design of traditional graph filters. We introduce a boundary-weighted operator that rescales each edge according to how far its endpoints drift toward the graph's Gromov boundary.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hierarchical graphs often exhibit tree-like branching patterns, a structural property that challenges the design of traditional graph filters. We introduce a boundary-weighted operator that rescales each edge according to how far its endpoints drift toward the graph's Gromov boundary. Using Busemann functions on delta-hyperbolic networks, we prove a closed-form upper bound on the operator's spectral norm: every signal loses a curvature-controlled fraction of its energy at each pass. The result delivers a parameter-free, lightweight filter whose stability follows directly from geometric first principles, offering a new analytic tool for graph signal processing on data with dense or hidden hierarchical structure.
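As a rough, parameter-light illustration of the abstract's idea (not the paper's exact construction), the sketch below approximates each node's Busemann value by its BFS depth from a chosen root, rescales every edge by a factor that decays with the deeper of its two endpoints, and checks numerically that the resulting filter contracts signal energy. The function name `boundary_weighted_operator`, the decay rate `alpha`, and the balanced binary tree are all assumptions made for the demo.

```python
import networkx as nx
import numpy as np

def boundary_weighted_operator(G, root, alpha=0.5):
    """Edge-rescaled propagation matrix: weights shrink as endpoints sit deeper
    in the graph, a crude stand-in for drift toward the Gromov boundary
    (the paper's construction uses Busemann functions on a delta-hyperbolic graph)."""
    depth = nx.single_source_shortest_path_length(G, root)   # Busemann proxy: BFS depth
    nodes = list(G.nodes())
    idx = {v: i for i, v in enumerate(nodes)}
    n = len(nodes)
    A = nx.to_numpy_array(G, nodelist=nodes)                  # unweighted adjacency
    W = np.zeros((n, n))
    for u, v in G.edges():
        w = np.exp(-alpha * max(depth[u], depth[v]))          # deeper edge -> smaller weight
        W[idx[u], idx[v]] = W[idx[v], idx[u]] = w
    # Normalize by the *unweighted* degrees: the unweighted normalized adjacency has
    # spectral norm exactly 1, so any norm below 1 here comes purely from the
    # boundary weighting.
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

# Toy hierarchical graph: balanced binary tree of depth 5, rooted at node 0.
G = nx.balanced_tree(r=2, h=5)
H = boundary_weighted_operator(G, root=0, alpha=0.5)

x = np.random.default_rng(0).normal(size=H.shape[0])          # random graph signal
print("spectral norm:", np.linalg.norm(H, 2))                  # strictly below 1
print("energy ratio :", np.linalg.norm(H @ x) / np.linalg.norm(x))
```

Because every rescaled edge weight is strictly smaller than its unweighted counterpart, the operator's spectral norm drops below 1 and each application removes a fixed fraction of signal energy, mirroring the contraction behavior described in the abstract.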
Related papers
- Bespoke multiresolution analysis of graph signals [0.0]
We present a novel framework for discrete multiresolution analysis of graph signals. The main analytical tool is the samplet transform, originally defined in the Euclidean framework as a discrete wavelet-like construction. For efficient numerical implementation, we employ heavy edge clustering to partition the graph into meaningful patches.
arXiv Detail & Related papers (2025-07-25T11:43:19Z) - A Spectral Interpretation of Redundancy in a Graph Reservoir [51.40366905583043]
This work revisits the definition of the reservoir in the Multiresolution Reservoir Graph Neural Network (MRGNN). It proposes a variant based on a Fairing algorithm originally introduced in the field of surface design in computer graphics. The core contribution of the paper lies in the theoretical analysis of the algorithm from a random walks perspective.
arXiv Detail & Related papers (2025-07-17T10:02:57Z) - A General Graph Spectral Wavelet Convolution via Chebyshev Order Decomposition [41.41593198072709]
We present a novel wavelet-based graph convolution network, namely WaveGC, which integrates multi-resolution spectral bases and a matrix-valued filter kernel. Theoretically, we establish that WaveGC can effectively capture and decouple short-range and long-range information, providing superior filtering flexibility. Our numerical experiments showcase consistent improvements in both short-range and long-range tasks.
arXiv Detail & Related papers (2024-05-22T16:32:27Z) - HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z) - Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z) - Limitless stability for Graph Convolutional Networks [8.1585306387285]
This work establishes rigorous, novel and widely applicable stability guarantees and transferability bounds for graph convolutional networks.
It is shown that graph convolutional networks are stable under graph coarse-graining procedures precisely if the GSO is the graph Laplacian and the filters are regular at infinity.
arXiv Detail & Related papers (2023-01-26T22:17:00Z) - Graph Spectral Embedding using the Geodesic Betweeness Centrality [76.27138343125985]
We introduce the Graph Sylvester Embedding (GSE), an unsupervised graph representation of local similarity, connectivity, and global structure.
GSE uses the solution of the Sylvester equation to capture both network structure and neighborhood proximity in a single representation.
arXiv Detail & Related papers (2022-05-07T04:11:23Z) - Interpretable Stability Bounds for Spectral Graph Filters [12.590415345079991]
We study filter stability and provide a novel and interpretable upper bound on the change of filter output.
This upper bound allows us to reason, in terms of structural properties of the graph, when a spectral graph filter will be stable.
arXiv Detail & Related papers (2021-02-18T19:25:52Z) - Stacked Graph Filter [19.343260981528186]
We study Graph Convolutional Networks (GCN) from the graph signal processing viewpoint.
We find that by stacking graph filters with learnable solution parameters, we can build a highly adaptive and robust graph classification model.
arXiv Detail & Related papers (2020-11-22T11:20:14Z) - Spectral Embedding of Graph Networks [76.27138343125985]
We introduce an unsupervised graph embedding that trades off local node similarity and connectivity, and global structure.
The embedding is based on a generalized graph Laplacian, whose eigenvectors compactly capture both network structure and neighborhood proximity in a single representation.
arXiv Detail & Related papers (2020-09-30T04:59:10Z) - From Spectrum Wavelet to Vertex Propagation: Graph Convolutional Networks Based on Taylor Approximation [85.47548256308515]
Graph convolutional networks (GCNs) have recently been used to extract the underlying structure of datasets with some labeled data and high-dimensional features.
Existing GCNs mostly rely on a first-order Chebyshev approximation of graph wavelet kernels; a minimal sketch of that propagation rule appears after this entry.
arXiv Detail & Related papers (2020-07-01T20:07:13Z)
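The first-order Chebyshev approximation referenced in the last entry reduces, in its best-known form, to the standard GCN propagation rule H = sigma(D^{-1/2} (A + I) D^{-1/2} X W). Below is a minimal sketch of that rule with random features and weights; the graph, feature width, and output width are placeholder choices, not anything taken from the cited paper.

```python
# One GCN layer built from the first-order Chebyshev approximation of a
# spectral graph filter: H = relu(D^{-1/2} (A + I) D^{-1/2} X W).
# Graph, feature width, and weights are placeholders for illustration only.
import networkx as nx
import numpy as np

G = nx.karate_club_graph()
A = nx.to_numpy_array(G) + np.eye(G.number_of_nodes())    # adjacency with self-loops
d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
A_hat = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]      # renormalized operator

rng = np.random.default_rng(0)
X = rng.normal(size=(G.number_of_nodes(), 8))              # node features (hypothetical)
W = rng.normal(size=(8, 4))                                 # weights (random, untrained)

H = np.maximum(A_hat @ X @ W, 0.0)                          # one ReLU-activated GCN layer
print(H.shape)                                              # (34, 4)
```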
This list is automatically generated from the titles and abstracts of the papers in this site.