Learning Product Graphs Underlying Smooth Graph Signals
- URL: http://arxiv.org/abs/2002.11277v2
- Date: Fri, 12 Jun 2020 21:16:03 GMT
- Title: Learning Product Graphs Underlying Smooth Graph Signals
- Authors: Muhammad Asad Lodhi and Waheed U. Bajwa
- Abstract summary: This paper devises a method to learn structured graphs, specifically product graphs, from data.
To this end, the graph learning problem is first posed as a linear program, which (on average) outperforms the state-of-the-art graph learning algorithms.
- Score: 15.023662220197242
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world data is often associated with irregular structures that can
be represented analytically as graphs. Having access to this graph, which is
sometimes trivially evident from domain knowledge, provides a better
representation of the data and facilitates various information processing
tasks. However, in cases where the underlying graph is unavailable, it needs to
be learned from the data itself for data representation, data processing and
inference purposes. Existing literature on learning graphs from data has mostly
considered arbitrary graphs, whereas the graphs generating real-world data tend
to have additional structure that can be incorporated in the graph learning
procedure. Structure-aware graph learning methods require learning fewer
parameters and have the potential to reduce computational, memory and sample
complexities. In light of this, the focus of this paper is to devise a method
to learn structured graphs from data that are given in the form of product
graphs. Product graphs arise naturally in many real-world datasets and provide
an efficient and compact representation of large-scale graphs through several
smaller factor graphs. To this end, first the graph learning problem is posed
as a linear program, which (on average) outperforms the state-of-the-art graph
learning algorithms. This formulation is of independent interest in its own right,
as it shows that graph learning is possible through a simple linear program.
Afterwards, an alternating minimization-based algorithm aimed at learning
various types of product graphs is proposed, and local convergence guarantees
to the true solution are established for this algorithm. Finally, the
performance gains, reduced sample complexity, and inference capabilities of the
proposed algorithm over existing methods are also validated through numerical
simulations on synthetic and real datasets.
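To make the linear-programming claim concrete, here is a minimal sketch of smoothness-based graph learning as an LP. It is not the paper's exact formulation: the unit-degree constraint and the SciPy solver below are illustrative assumptions. The point is that the smoothness measure tr(X^T L X) = (1/2) sum_{i,j} W_{ij} ||x_i - x_j||^2 is linear in the edge weights W_{ij}, so once the weights are constrained only by linear (in)equalities (nonnegativity plus degree constraints that rule out the all-zero graph), the whole problem is a linear program.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.spatial.distance import pdist, squareform

# Hypothetical smooth graph signals: row i of X collects the values observed at node i.
rng = np.random.default_rng(0)
n, m = 10, 200                      # n nodes, m signals (illustrative sizes)
X = rng.standard_normal((n, m))

# Smoothness cost tr(X^T L X) = (1/2) * sum_ij W_ij * ||x_i - x_j||^2 is linear in W,
# so the LP objective is a fixed vector of pairwise squared distances.
z = pdist(X, metric="sqeuclidean")  # length n*(n-1)/2, ordered (0,1), (0,2), ..., (n-2,n-1)

# Degree constraints sum_j W_ij = 1 (one equality per node) rule out the trivial W = 0.
p = len(z)
A_eq = np.zeros((n, p))
k = 0
for i in range(n):
    for j in range(i + 1, n):
        A_eq[i, k] = 1.0            # edge (i, j) contributes to the degree of node i
        A_eq[j, k] = 1.0            # ... and to the degree of node j
        k += 1
b_eq = np.ones(n)

# Linear program: minimize z^T w subject to A_eq w = 1, w >= 0.
res = linprog(c=z, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
W = squareform(res.x)               # learned (symmetric) adjacency matrix
L = np.diag(W.sum(axis=1)) - W      # combinatorial graph Laplacian
```

The learned W in this toy instance is typically sparse, favouring edges between nodes whose signal values are close; the paper's actual constraint set differs, but the linearity argument it relies on is the same.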
Related papers
- Joint Signal Recovery and Graph Learning from Incomplete Time-Series [24.308357458676937]
In this work, we aim to learn a graph from incomplete time-series observations.
We propose an algorithm based on the method of block successive upper-bound minimization.
Simulation results on synthetic and real time-series demonstrate the performance of the proposed method.
arXiv Detail & Related papers (2023-12-28T10:27:04Z)
- The Graph Lottery Ticket Hypothesis: Finding Sparse, Informative Graph Structure [18.00833762891405]
Graph Lottery Ticket (GLT) Hypothesis: There is an extremely sparse backbone for every graph.
We study 8 key metrics of interest that directly influence the performance of graph learning algorithms.
We propose a straightforward and efficient algorithm for finding these GLTs in arbitrary graphs.
arXiv Detail & Related papers (2023-12-08T00:24:44Z)
- GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z)
- Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
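For context, the graph mean in this related work is computed in Bures-Wasserstein geometry. Assuming (as is common in graph signal processing, though not spelled out in the summary above) that each graph is embedded as a zero-mean Gaussian distribution whose covariance is derived from its Laplacian, the relevant distance between two embedded graphs with covariances $\Sigma_1$ and $\Sigma_2$ is the closed-form 2-Wasserstein (Bures) distance

$$
d_{BW}^2(\Sigma_1, \Sigma_2) = \operatorname{tr}(\Sigma_1) + \operatorname{tr}(\Sigma_2) - 2\,\operatorname{tr}\!\Big(\big(\Sigma_1^{1/2}\,\Sigma_2\,\Sigma_1^{1/2}\big)^{1/2}\Big),
$$

and the graph mean is the covariance minimizing the sum of these squared distances (the Wasserstein barycenter).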
arXiv Detail & Related papers (2023-05-31T11:04:53Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- Heterogeneous Graph Sparsification for Efficient Representation Learning [15.075580975708654]
We develop sampling-based algorithms for constructing sparsifiers that are provably sparse and preserve important information in the original graphs.
We have performed extensive experiments to confirm that the proposed method can improve time and space complexities of representation learning while achieving comparable, or even better performance in subsequent graph learning tasks.
arXiv Detail & Related papers (2022-11-14T16:47:06Z)
- Synthetic Graph Generation to Benchmark Graph Learning [7.914804101579097]
Graph learning algorithms have attained state-of-the-art performance on many graph analysis tasks.
One reason is the very small number of datasets used in practice to benchmark the performance of graph learning algorithms.
We propose to generate synthetic graphs, and study the behaviour of graph learning algorithms in a controlled scenario.
arXiv Detail & Related papers (2022-04-04T10:48:32Z)
- Product Graph Learning from Multi-domain Data with Sparsity and Rank Constraints [17.15829643665034]
We propose an efficient iterative solver for learning sparse product graphs from data.
We extend this solver to infer multi-component graph factors with applications to product graph clustering.
The efficacy of the developed framework is demonstrated using several numerical experiments on synthetic data and real data.
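Both the paper above and this related work rely on the fact that a product graph on n1*n2 nodes is fully determined by its two factors, so roughly n1^2 + n2^2 parameters suffice instead of (n1*n2)^2. A minimal sketch of that structure for the Cartesian product, whose Laplacian is the Kronecker sum of the factor Laplacians (the factor graphs below are illustrative, not taken from either paper):

```python
import numpy as np

def cartesian_product_laplacian(L1, L2):
    """Laplacian of the Cartesian product graph: the Kronecker sum
    L = kron(L1, I_n2) + kron(I_n1, L2)."""
    n1, n2 = L1.shape[0], L2.shape[0]
    return np.kron(L1, np.eye(n2)) + np.kron(np.eye(n1), L2)

# Two small illustrative factor graphs: a path on 3 nodes and a cycle on 4 nodes.
A1 = np.array([[0, 1, 0],
               [1, 0, 1],
               [0, 1, 0]], dtype=float)
A2 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]], dtype=float)
L1 = np.diag(A1.sum(axis=1)) - A1
L2 = np.diag(A2.sum(axis=1)) - A2

L = cartesian_product_laplacian(L1, L2)
print(L.shape)   # (12, 12): the product graph has 3 * 4 nodes but only 9 + 16 factor parameters
```

Kronecker and strong products admit analogous factorizations in terms of the factor adjacency matrices, which is what makes an alternating scheme natural: update one factor while the other is held fixed, then swap.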
arXiv Detail & Related papers (2020-12-15T04:59:32Z)
- GraphOpt: Learning Optimization Models of Graph Formation [72.75384705298303]
We propose an end-to-end framework that learns an implicit model of graph structure formation and discovers an underlying optimization mechanism.
The learned objective can serve as an explanation for the observed graph properties, thereby lending itself to transfer across different graphs within a domain.
GraphOpt poses link formation in graphs as a sequential decision-making process and solves it using a maximum-entropy inverse reinforcement learning algorithm.
arXiv Detail & Related papers (2020-07-07T16:51:39Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.