Learning Product Graphs from Spectral Templates
- URL: http://arxiv.org/abs/2211.02893v1
- Date: Sat, 5 Nov 2022 12:28:11 GMT
- Title: Learning Product Graphs from Spectral Templates
- Authors: Aref Einizade, Sepideh Hajipour Sardouie
- Abstract summary: Graph Learning (GL) is at the core of inference and analysis of connections in data mining and machine learning (ML).
We propose learning product (high-dimensional) graphs from product spectral templates with significantly reduced complexity.
In contrast to the few existing approaches, our approach can learn all types of product graphs without knowing the type of graph product.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Learning (GL) is at the core of inference and analysis of connections
in data mining and machine learning (ML). By observing a dataset of graph
signals, and considering specific assumptions, Graph Signal Processing (GSP)
tools can provide practical constraints in the GL approach. One applicable
constraint is to infer a graph with desired frequency signatures, i.e., spectral
templates. However, a severe computational burden is a challenging barrier,
especially for inference from high-dimensional graph signals. To address this
issue and in the case of the underlying graph having graph product structure,
we propose learning product (high dimensional) graphs from product spectral
templates with significantly reduced complexity rather than learning them
directly from high-dimensional graph signals, which, to the best of our
knowledge, has not been addressed in related areas. In contrast to the few
existing approaches, our approach can learn all types of product graphs
(including products of more than two factor graphs) without knowing the type of
graph product, and it has fewer parameters. Experimental results on both
synthetic and real-world data, i.e., brain signal analysis and multi-view
object images, yield explainable and meaningful factor graphs supported by
expert-related research, and outperform the few existing, more restricted
approaches.
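A minimal sketch of why product structure reduces the cost of spectral-template learning: for a Cartesian product graph, the eigenvectors (spectral templates) of the large Laplacian are Kronecker products of the factor graphs' eigenvectors, so they can be obtained from two small eigendecompositions instead of one large one. The two toy factor graphs below are illustrative assumptions, not from the paper:

```python
import numpy as np

def laplacian(A):
    # Combinatorial graph Laplacian L = D - A.
    return np.diag(A.sum(axis=1)) - A

# Two small factor graphs (a 3-node path and a triangle), chosen for illustration.
A1 = np.array([[0., 1., 0.],
               [1., 0., 1.],
               [0., 1., 0.]])
A2 = np.array([[0., 1., 1.],
               [1., 0., 1.],
               [1., 1., 0.]])
L1, L2 = laplacian(A1), laplacian(A2)

# Cartesian product graph Laplacian: L = L1 (+) L2 = L1 x I + I x L2 (Kronecker sum).
n1, n2 = L1.shape[0], L2.shape[0]
L = np.kron(L1, np.eye(n2)) + np.kron(np.eye(n1), L2)

# Eigendecompose only the small factors; the product's spectral templates
# are Kronecker products of the factor eigenvectors.
w1, V1 = np.linalg.eigh(L1)
w2, V2 = np.linalg.eigh(L2)
V = np.kron(V1, V2)                        # product eigenvectors
lam = (w1[:, None] + w2[None, :]).ravel()  # product eigenvalues

# Check L V = V diag(lam) up to numerical error.
err = np.linalg.norm(L @ V - V * lam)
print(err < 1e-8)
```

This factorization is what lets the factor graphs be learned from small per-factor spectral templates rather than from the full high-dimensional graph signals.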
Related papers
- Heterogeneous Graph Contrastive Learning with Spectral Augmentation [15.231689595121553]
This paper introduces a spectral-enhanced graph contrastive learning model (SHCL) for the first time in heterogeneous graph neural networks.
The proposed model learns an adaptive topology augmentation scheme through the heterogeneous graph itself.
Experimental results on multiple real-world datasets demonstrate substantial advantages of the proposed model.
arXiv Detail & Related papers (2024-06-30T14:20:12Z) - Polynomial Graphical Lasso: Learning Edges from Gaussian Graph-Stationary Signals [18.45931641798935]
This paper introduces Polynomial Graphical Lasso (PGL), a new approach to learning graph structures from nodal signals.
Our key contribution lies in modeling the signals as Gaussian and stationary on the graph, enabling the development of a graph-learning lasso.
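A small sketch of the graph-stationarity property this line of work relies on: if a Gaussian signal is produced by filtering white noise with a polynomial of the graph Laplacian, its covariance is also a polynomial of the Laplacian, so the two matrices commute and share eigenvectors. The 4-node graph and filter coefficients below are illustrative assumptions:

```python
import numpy as np

# Hypothetical 4-node cycle graph (an assumption for illustration).
A = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])
L = np.diag(A.sum(axis=1)) - A

# A signal x = H w, with H = h(L) a graph-polynomial filter and w white
# Gaussian noise, has covariance C = H H^T = h(L)^2: again a polynomial in L.
H = 0.5 * np.eye(4) + 0.3 * L + 0.1 * (L @ L)
C = H @ H.T

# Stationarity check: C and L commute, hence share eigenvectors. This shared
# eigenbasis is what lets the graph be recovered from covariance estimates.
commutator = np.linalg.norm(C @ L - L @ C)
print(commutator < 1e-8)
```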
arXiv Detail & Related papers (2024-04-03T10:19:53Z) - Towards Self-Interpretable Graph-Level Anomaly Detection [73.1152604947837]
Graph-level anomaly detection (GLAD) aims to identify graphs that exhibit notable dissimilarity compared to the majority in a collection.
We propose a Self-Interpretable Graph aNomaly dETection model (SIGNET) that detects anomalous graphs and simultaneously generates informative explanations.
arXiv Detail & Related papers (2023-10-25T10:10:07Z) - Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural network (HGNN) is a very popular technique for the modeling and analysis of heterogeneous graphs.
We develop a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this new approach, we adopt distinct, well-suited attribute and topology fusion mechanisms in the two views, which help mine relevant information from attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z) - Graph Structure Learning with Variational Information Bottleneck [70.62851953251253]
We propose a novel Variational Information Bottleneck guided Graph Structure Learning framework, namely VIB-GSL.
VIB-GSL learns an informative and compressive graph structure to distill the actionable information for specific downstream tasks.
arXiv Detail & Related papers (2021-12-16T14:22:13Z) - Towards Graph Self-Supervised Learning with Contrastive Adjusted Zooming [48.99614465020678]
We introduce G-Zoom, a novel self-supervised graph representation learning algorithm based on Graph Contrastive Adjusted Zooming.
This mechanism enables G-Zoom to explore and extract self-supervision signals from a graph at multiple scales.
We have conducted extensive experiments on real-world datasets, and the results demonstrate that our proposed model outperforms state-of-the-art methods consistently.
arXiv Detail & Related papers (2021-11-20T22:45:53Z) - Understanding Coarsening for Embedding Large-Scale Graphs [3.6739949215165164]
Proper analysis of graphs with Machine Learning (ML) algorithms has the potential to yield far-reaching insights into many areas of research and industry.
The irregular structure of graph data constitutes an obstacle for running ML tasks on graphs.
We analyze the impact of the coarsening quality on the embedding performance both in terms of speed and accuracy.
arXiv Detail & Related papers (2020-09-10T15:06:33Z) - Kernel-based Graph Learning from Smooth Signals: A Functional Viewpoint [15.577175610442351]
We propose a novel graph learning framework that incorporates the node-side and observation-side information.
We use graph signals as functions in the reproducing kernel Hilbert space associated with a Kronecker product kernel.
We develop a novel graph-based regularisation method which, when combined with the Kronecker product kernel, enables our model to capture both the dependency explained by the graph and the dependency due to graph signals.
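A brief sketch of the Kronecker product kernel structure mentioned above: a product kernel over (node, observation) pairs factors as k((i, s), (j, t)) = k_G(i, j) * k_f(s, t), so its joint Gram matrix is the Kronecker product of the two factor Gram matrices. The two small positive-definite kernel matrices below are illustrative assumptions:

```python
import numpy as np

# Hypothetical factor Gram matrices: K_G over 3 graph nodes, K_f over 2 observations.
K_G = np.array([[1.0, 0.5, 0.2],
                [0.5, 1.0, 0.5],
                [0.2, 0.5, 1.0]])
K_f = np.array([[1.0, 0.3],
                [0.3, 1.0]])

# Joint Gram matrix over all (node, observation) pairs, in row-major pair order:
# entry ((i, s), (j, t)) equals K_G[i, j] * K_f[s, t].
K = np.kron(K_G, K_f)

# Spot-check one entry: pair (node 1, obs 0) against pair (node 2, obs 1).
i, s, j, t = 1, 0, 2, 1
print(abs(K[i * 2 + s, j * 2 + t] - K_G[i, j] * K_f[s, t]) < 1e-12)
```

Working with the two small factors instead of the full 6x6 (and, in practice, much larger) joint matrix is what keeps such Kronecker-structured models tractable.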
arXiv Detail & Related papers (2020-08-23T16:04:23Z) - Learning Product Graphs Underlying Smooth Graph Signals [15.023662220197242]
This paper devises a method to learn structured graphs from data that are given in the form of product graphs.
To this end, the graph learning problem is first posed as a linear program, which (on average) outperforms state-of-the-art graph learning algorithms.
arXiv Detail & Related papers (2020-02-26T03:25:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.