Interferometric Graph Transform: a Deep Unsupervised Graph
Representation
- URL: http://arxiv.org/abs/2006.05722v1
- Date: Wed, 10 Jun 2020 08:27:53 GMT
- Title: Interferometric Graph Transform: a Deep Unsupervised Graph
Representation
- Authors: Edouard Oyallon (MLIA)
- Abstract summary: Interferometric Graph Transform (IGT) is a new class of deep unsupervised graph convolutional neural network for building graph representations.
We show that our learned representation consists of both discriminative and invariant features, thanks to a novel greedy concave objective.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose the Interferometric Graph Transform (IGT), which is a new class of
deep unsupervised graph convolutional neural network for building graph
representations. Our first contribution is to propose a generic, complex-valued
spectral graph architecture obtained from a generalization of the Euclidean
Fourier transform. We show that our learned representation consists of both
discriminative and invariant features, thanks to a novel greedy concave
objective. From our experiments, we conclude that our learning procedure
exploits the topology of the spectral domain, which is normally a flaw of
spectral methods, and in particular our method can recover an analytic operator
for vision tasks. We test our algorithm on various challenging tasks such as
image classification (MNIST, CIFAR-10), community detection (Authorship,
Facebook graph) and action recognition from 3D skeleton videos (SBU, NTU),
exhibiting a new state-of-the-art in spectral graph unsupervised settings.
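The abstract describes the representation as a complex-valued spectral architecture obtained from a generalized Fourier transform, with features made both discriminative and invariant. The sketch below shows only a generic version of that building block: a graph Fourier transform from the Laplacian eigendecomposition, complex spectral filters, and a modulus non-linearity (a common way to obtain invariance in scattering-type representations, assumed here). The learned filters and the greedy concave objective of the paper are replaced by random placeholders, so this is an illustration of the ingredients, not the IGT architecture itself.

```python
# Minimal sketch of a spectral graph transform layer, assuming a generic graph
# Fourier transform built from the Laplacian eigendecomposition. NOT the exact
# IGT architecture: filters are random placeholders instead of being learned
# with the paper's greedy concave objective.
import numpy as np

def graph_fourier_basis(adjacency):
    """Eigenvalues/eigenvectors of the combinatorial Laplacian L = D - A."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency
    eigvals, eigvecs = np.linalg.eigh(laplacian)  # symmetric -> real spectrum
    return eigvals, eigvecs

def spectral_layer(signal, eigvecs, filters):
    """Apply complex spectral filters to a graph signal, then take the modulus.

    signal : (n,) real graph signal
    eigvecs: (n, n) graph Fourier basis U
    filters: (k, n) complex filter coefficients in the spectral domain
    Returns a (k, n) non-negative feature map; the modulus discards phase and
    yields invariant features, in the spirit of interferometric representations.
    """
    spectrum = eigvecs.T @ signal            # graph Fourier transform
    filtered = filters * spectrum[None, :]   # pointwise spectral filtering
    back = eigvecs @ filtered.T              # inverse transform, one column per filter
    return np.abs(back).T                    # modulus non-linearity

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = (rng.random((8, 8)) < 0.3).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                              # undirected toy graph
    x = rng.standard_normal(8)
    _, U = graph_fourier_basis(A)
    W = rng.standard_normal((4, 8)) + 1j * rng.standard_normal((4, 8))
    print(spectral_layer(x, U, W).shape)     # (4, 8)
```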
Related papers
- Spectral Neural Graph Sparsification [0.21932521132244476]
Graphs are central to modeling complex systems in domains such as social networks, molecular chemistry, and neuroscience.
We propose the Spectral Preservation Network, a new framework for graph representation learning that generates reduced graphs serving as faithful proxies of the original.
We evaluate the effectiveness of the Spectral Preservation Network on node-level sparsification by analyzing well-established metrics and benchmarking against state-of-the-art methods.
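As a rough illustration of what a "faithful spectral proxy" means, the hedged sketch below compares the low end of the Laplacian spectra of an original graph and a reduced graph over the same node set; it is only an evaluation-style metric of the kind such sparsification methods target, not the Spectral Preservation Network itself.

```python
# Hedged sketch: measure how well a sparsified graph preserves the original
# Laplacian spectrum (an assumed, simplified fidelity metric, not the paper's
# actual network or objective).
import numpy as np

def laplacian_spectrum(adjacency):
    lap = np.diag(adjacency.sum(axis=1)) - adjacency
    return np.sort(np.linalg.eigvalsh(lap))

def spectral_distortion(A_full, A_sparse, k=10):
    """Compare the k smallest Laplacian eigenvalues of the two graphs
    (both defined on the same node set)."""
    s_full = laplacian_spectrum(A_full)[:k]
    s_sparse = laplacian_spectrum(A_sparse)[:k]
    return np.linalg.norm(s_full - s_sparse)
```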
arXiv Detail & Related papers (2025-10-31T13:51:50Z)
- Spectral Graph Reasoning Network for Hyperspectral Image Classification [0.43512163406551996]
Convolutional neural networks (CNNs) have achieved remarkable performance in hyperspectral image (HSI) classification.
We propose a spectral graph reasoning network (SGR) learning framework comprising two crucial modules.
Experiments on two HSI datasets demonstrate that the proposed architecture can significantly improve the classification accuracy.
arXiv Detail & Related papers (2024-07-02T20:29:23Z)
- What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding [67.59552859593985]
Graph Transformers, which incorporate self-attention and positional encoding, have emerged as a powerful architecture for various graph learning tasks.
This paper introduces the first theoretical investigation of a shallow Graph Transformer for semi-supervised classification.
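For context, a minimal sketch of the two ingredients under study, self-attention over node features plus a spectral positional encoding, is given below; the Laplacian-eigenvector encoding is a common choice and is assumed here, not taken from the paper.

```python
# Hedged sketch of a tiny Graph Transformer: node features are concatenated
# with a Laplacian-eigenvector positional encoding and fed to a standard
# self-attention encoder layer. Dimensions and the node-level head are
# illustrative placeholders.
import numpy as np
import torch
import torch.nn as nn

def laplacian_positional_encoding(adjacency, k):
    """k smallest non-trivial eigenvectors of the normalized Laplacian."""
    deg = adjacency.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(len(deg)) - d_inv_sqrt @ adjacency @ d_inv_sqrt
    _, vecs = np.linalg.eigh(lap)
    return torch.tensor(vecs[:, 1:k + 1], dtype=torch.float32)

class TinyGraphTransformer(nn.Module):
    def __init__(self, feat_dim, pe_dim, hidden=64, heads=4):
        super().__init__()
        self.proj = nn.Linear(feat_dim + pe_dim, hidden)
        self.attn = nn.TransformerEncoderLayer(hidden, heads, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # e.g. binary node classification

    def forward(self, x, pe):
        # x: (n, feat_dim) node features, pe: (n, pe_dim) positional encoding
        h = self.proj(torch.cat([x, pe], dim=-1)).unsqueeze(0)
        return self.head(self.attn(h)).squeeze(0)
```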
arXiv Detail & Related papers (2024-06-04T05:30:16Z)
- Isomorphic-Consistent Variational Graph Auto-Encoders for Multi-Level Graph Representation Learning [9.039193854524763]
We propose the Isomorphic-Consistent VGAE (IsoC-VGAE) for task-agnostic graph representation learning.
We first devise a decoding scheme to provide a theoretical guarantee of keeping the isomorphic consistency.
We then propose the Inverse Graph Neural Network (Inv-GNN) decoder as its intuitive realization.
arXiv Detail & Related papers (2023-12-09T10:16:53Z)
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
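A hedged sketch of the general idea follows: a polynomial filter written directly in powers of a characteristic operator needs no symmetric graph Fourier transform and therefore also applies to directed graphs, and its frequency response can be read off the operator's (possibly complex) eigenvalues. This illustrates the principle only, not the HoloNets construction.

```python
# Hedged sketch: polynomial spectral filtering without an explicit graph
# Fourier transform, usable on directed graphs. The characteristic operator
# and coefficients are generic placeholders.
import numpy as np

def polynomial_filter(operator, signal, coeffs):
    """Apply sum_k coeffs[k] * T^k to a node signal."""
    out = np.zeros_like(signal, dtype=complex)
    power = signal.astype(complex)
    for c in coeffs:
        out += c * power
        power = operator @ power
    return out

def frequency_response(operator, coeffs):
    """Filter response evaluated at the operator's (possibly complex) eigenvalues."""
    eigvals = np.linalg.eigvals(operator)
    return eigvals, sum(c * eigvals ** k for k, c in enumerate(coeffs))
```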
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
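A minimal sketch of such a bank of candidate augmentations (edge dropping, feature masking, and a combination) is shown below; the spectrally motivated operations actually proposed in the paper are not reproduced here.

```python
# Hedged sketch of a small bank of candidate graph augmentations of the kind a
# contrastive objective can sample from. Rates and the specific operations are
# illustrative placeholders.
import numpy as np

def drop_edges(adjacency, rate, rng):
    mask = rng.random(adjacency.shape) > rate
    mask = np.triu(mask, 1)
    mask = mask + mask.T                      # keep the graph undirected
    return adjacency * mask

def mask_features(features, rate, rng):
    return features * (rng.random(features.shape) > rate)

AUGMENTATION_BANK = [
    lambda A, X, rng: (drop_edges(A, 0.2, rng), X),
    lambda A, X, rng: (A, mask_features(X, 0.2, rng)),
    lambda A, X, rng: (drop_edges(A, 0.1, rng), mask_features(X, 0.1, rng)),
]
```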
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- Spectral Augmentation for Self-Supervised Learning on Graphs [43.19199994575821]
Graph contrastive learning (GCL) aims to learn representations via instance discrimination.
It relies on graph augmentation to reflect invariant patterns that are robust to small perturbations.
Recent studies mainly perform topology augmentations in a uniformly random manner in the spatial domain.
We develop spectral augmentation which guides topology augmentations by maximizing the spectral change.
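The sketch below illustrates the guiding principle only: candidate edge drops are scored by the change they induce in the Laplacian spectrum, and the perturbation with the largest change is kept. The paper's actual criterion and optimization are more involved than this brute-force scan.

```python
# Hedged sketch of spectrally guided topology augmentation: pick the edge
# removal that maximizes the change in the Laplacian spectrum (a simplified,
# brute-force stand-in for the paper's objective).
import numpy as np

def spectrum(adjacency):
    return np.linalg.eigvalsh(np.diag(adjacency.sum(axis=1)) - adjacency)

def best_edge_drop(adjacency):
    base = spectrum(adjacency)
    best, best_change = None, -np.inf
    rows, cols = np.triu_indices_from(adjacency, k=1)
    for i, j in zip(rows, cols):
        if adjacency[i, j] == 0:
            continue                          # only existing edges can be dropped
        perturbed = adjacency.copy()
        perturbed[i, j] = perturbed[j, i] = 0
        change = np.linalg.norm(spectrum(perturbed) - base)
        if change > best_change:
            best, best_change = (i, j), change
    return best, best_change
```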
arXiv Detail & Related papers (2022-10-02T22:20:07Z)
- Template based Graph Neural Network with Optimal Transport Distances [11.56532171513328]
Current Graph Neural Networks (GNN) architectures rely on two important components: node features embedding through message passing, and aggregation with a specialized form of pooling.
We propose in this work a novel point of view, which places distances to some learnable graph templates at the core of the graph representation.
This distance embedding is constructed thanks to an optimal transport distance: the Fused Gromov-Wasserstein (FGW) distance.
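A hedged sketch of the "distances to templates" representation follows; a simple Laplacian-spectrum distance stands in for the Fused Gromov-Wasserstein distance (available, for instance, in the POT library) so the example stays self-contained.

```python
# Hedged sketch: embed a graph as its vector of distances to a set of graph
# templates. A Laplacian-spectrum distance is used as a stand-in for the
# Fused Gromov-Wasserstein distance of the paper.
import numpy as np

def spectral_distance(A1, A2, k=8):
    def spec(A):
        vals = np.linalg.eigvalsh(np.diag(A.sum(axis=1)) - A)
        padded = np.zeros(k)                  # zero-pad so graphs of different sizes compare
        padded[:min(k, len(vals))] = vals[:k]
        return padded
    return np.linalg.norm(spec(A1) - spec(A2))

def template_embedding(adjacency, templates):
    """Graph-level representation: one distance per template."""
    return np.array([spectral_distance(adjacency, T) for T in templates])
```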
arXiv Detail & Related papers (2022-05-31T12:24:01Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN)
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
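A hedged sketch of such an unrolled, truncated proximal-gradient network is given below: each layer performs one (approximate) gradient step on a reconstruction objective followed by a soft threshold, with step sizes and thresholds learned per layer. The mixing model and the absence of a training loss are simplified placeholders, not the paper's exact GDN.

```python
# Hedged sketch of an unrolled proximal-gradient "deconvolution" network:
# recover a latent graph from an observed convolutional mixture of it.
# The polynomial mixing model and the approximate gradient step are assumptions
# made for brevity, not the paper's exact formulation.
import torch
import torch.nn as nn

class UnrolledDeconvolution(nn.Module):
    def __init__(self, n_layers=5):
        super().__init__()
        self.step = nn.Parameter(torch.full((n_layers,), 0.1))     # per-layer step size
        self.thresh = nn.Parameter(torch.full((n_layers,), 0.01))  # per-layer threshold
        self.mix = nn.Parameter(torch.tensor([0.0, 1.0, 0.3]))     # h0*I + h1*A + h2*A^2

    def mixture(self, A):
        I = torch.eye(A.shape[-1], device=A.device)
        return self.mix[0] * I + self.mix[1] * A + self.mix[2] * A @ A

    def forward(self, A_obs):
        A = A_obs.clone()
        for k in range(len(self.step)):
            residual = self.mixture(A) - A_obs                      # data-fit residual
            A = A - self.step[k] * residual                         # approximate gradient step
            A = torch.relu(A.abs() - self.thresh[k]) * A.sign()     # soft threshold (sparsity prox)
        return 0.5 * (A + A.transpose(-1, -2))                      # symmetrize the estimate
```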
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Spectral Embedding of Graph Networks [76.27138343125985]
We introduce an unsupervised graph embedding that trades off local node similarity and connectivity, and global structure.
The embedding is based on a generalized graph Laplacian, whose eigenvectors compactly capture both network structure and neighborhood proximity in a single representation.
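A minimal sketch of this kind of spectral embedding, using the leading non-trivial eigenvectors of a (regularized) normalized Laplacian, is shown below; the regularization parameter is an assumed stand-in for the local/global trade-off, not the paper's generalized Laplacian.

```python
# Hedged sketch of a spectral node embedding from Laplacian eigenvectors.
# The degree offset tau is an assumed, simplified proxy for the paper's
# trade-off between local similarity and global structure.
import numpy as np

def spectral_embedding(adjacency, dim, tau=0.0):
    deg = adjacency.sum(axis=1) + tau
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(len(deg)) - d_inv_sqrt @ adjacency @ d_inv_sqrt
    _, vecs = np.linalg.eigh(lap)
    return vecs[:, 1:dim + 1]   # skip the trivial (constant) eigenvector
```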
arXiv Detail & Related papers (2020-09-30T04:59:10Z)
- Bridging the Gap Between Spectral and Spatial Domains in Graph Neural Networks [8.563354084119062]
We show the equivalence of the graph convolution process regardless of whether it is designed in the spatial or the spectral domain.
The proposed framework is used to design new convolutions in spectral domain with a custom frequency profile while applying them in the spatial domain.
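The sketch below shows this mechanism in its simplest form: a custom frequency profile g(lambda) is specified in the spectral domain and turned into a dense spatial operator U g(Lambda) U^T that acts directly on node signals; the particular frequency profiles of the paper's framework are replaced by an arbitrary low-pass example.

```python
# Hedged sketch: design a filter in the spectral domain via a frequency profile
# g(lambda), then apply it in the spatial domain as U g(Lambda) U^T.
# The low-pass profile is an arbitrary example, not one from the paper.
import numpy as np

def spatial_filter_from_profile(adjacency, profile):
    deg = adjacency.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(len(deg)) - d_inv_sqrt @ adjacency @ d_inv_sqrt
    vals, U = np.linalg.eigh(lap)
    return U @ np.diag(profile(vals)) @ U.T   # dense spatial convolution operator

low_pass = lambda lam: np.exp(-2.0 * lam)     # example custom frequency profile
```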
arXiv Detail & Related papers (2020-03-26T01:49:24Z)