Signed Graph Learning: Algorithms and Theory
- URL: http://arxiv.org/abs/2507.09717v1
- Date: Sun, 13 Jul 2025 17:33:26 GMT
- Title: Signed Graph Learning: Algorithms and Theory
- Authors: Abdullah Karaaslanli, Bisakh Banerjee, Tapabrata Maiti, Selin Aviyente
- Abstract summary: Real-world data is often represented through the relationships between data samples, forming a graph structure. Current graph learning research has primarily focused on unsigned graphs. We develop a method for learning signed graphs from a set of smooth signed graph signals.
- Score: 17.374356596021936
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Real-world data is often represented through the relationships between data samples, forming a graph structure. In many applications, this graph structure must be learned from the observed data. Current graph learning research has primarily focused on unsigned graphs, which consist only of positive edges. However, many biological and social systems are better described by signed graphs that account for both positive and negative interactions, capturing similarity and dissimilarity between samples. In this paper, we develop a method for learning signed graphs from a set of smooth signed graph signals. Specifically, we employ the net Laplacian as a graph shift operator (GSO) and define smooth signed graph signals as the outputs of a low-pass signed graph filter built from the net Laplacian. The signed graph is then learned by formulating a non-convex optimization problem that minimizes the total variation of the observed signals with respect to the net Laplacian. The problem is solved with the alternating direction method of multipliers (ADMM), and a fast algorithm that reduces the per-iteration complexity of ADMM from quadratic to linear in the number of nodes is introduced. Furthermore, we provide a convergence proof for the algorithm and a bound on the estimation error of the learned net Laplacian as a function of sample size, number of nodes, and graph topology. Finally, the proposed method is evaluated on simulated data and on a gene regulatory network inference problem, and compared to existing signed graph learning methods.
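The two objects at the core of the abstract are the net Laplacian of a signed graph and the total-variation smoothness term that the optimization minimizes. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the variable names and the toy graph are invented for the example, and the full method additionally constrains the learned matrix to be a valid net Laplacian and solves the resulting non-convex problem with ADMM, which is not shown here.

```python
# Minimal sketch (assumed names, not the paper's code): net Laplacian of a
# signed graph and the total-variation smoothness of a set of graph signals.
import numpy as np

def net_laplacian(A):
    """Net Laplacian L = D_net - A for a signed adjacency matrix A,
    where D_net = diag(A @ 1) holds the net degrees
    (positive degree minus negative degree of each node)."""
    d_net = A.sum(axis=1)          # net degree of each node
    return np.diag(d_net) - A

def total_variation(A, X):
    """Smoothness of the signals in X (columns = graph signals) with respect
    to the net Laplacian: tr(X^T L X). Signed graph learning seeks the graph
    that makes this quantity small for the observed signals."""
    L = net_laplacian(A)
    return np.trace(X.T @ L @ X)

# Toy usage: a 4-node signed graph with one negative edge and 3 random signals.
A = np.array([[ 0.,  1., -1.,  0.],
              [ 1.,  0.,  0.,  1.],
              [-1.,  0.,  0.,  1.],
              [ 0.,  1.,  1.,  0.]])
X = np.random.default_rng(0).standard_normal((4, 3))
print(total_variation(A, X))
```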
Related papers
- Online Proximal ADMM for Graph Learning from Streaming Smooth Signals [9.34612743192798]
We develop a novel algorithm for online graph learning using observation streams, assumed to be smooth on the latent graph.
Our modus operandi is to process graph signals sequentially and thus keep memory and computational costs in check.
The proposed approach also exhibits better tracking performance (in terms of suboptimality) when compared to state-of-the-art online graph learning baselines.
arXiv Detail & Related papers (2024-09-19T17:12:03Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Graph Laplacian Learning with Exponential Family Noise [8.594140167290098]
We propose a versatile graph inference framework for learning from graph signals corrupted by exponential family noise.
Our framework generalizes previous methods from continuous smooth graph signals to various data types.
arXiv Detail & Related papers (2023-06-14T02:09:52Z) - Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]^2 in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
arXiv Detail & Related papers (2022-12-15T22:11:34Z) - Signed Graph Neural Networks: A Frequency Perspective [14.386571627652975]
Graph convolutional networks (GCNs) are designed for unsigned graphs containing only positive links.
We propose two different signed graph neural networks, one keeps only low-frequency information and one also retains high-frequency information.
arXiv Detail & Related papers (2022-08-15T16:42:18Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Graphon based Clustering and Testing of Networks: Algorithms and Theory [11.3700474413248]
Network-valued data are encountered in a wide range of applications and pose challenges in learning.
We present two clustering algorithms that achieve state-of-the-art results.
We further study the applicability of the proposed distance for graph two-sample testing problems.
arXiv Detail & Related papers (2021-10-06T13:14:44Z) - Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method yields representative graphs with a clear cluster structure, which can then be used to solve clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z) - Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Wasserstein-based Graph Alignment [56.84964475441094]
We propose a new formulation of the one-to-many graph alignment problem, which aims to match a node in the smaller graph with one or more nodes in the larger graph.
We show that our method leads to significant improvements with respect to the state-of-the-art algorithms for each of these tasks.
arXiv Detail & Related papers (2020-03-12T22:31:59Z)