Graph Laplacian Learning with Exponential Family Noise
- URL: http://arxiv.org/abs/2306.08201v2
- Date: Tue, 11 Jun 2024 23:52:10 GMT
- Title: Graph Laplacian Learning with Exponential Family Noise
- Authors: Changhao Shi, Gal Mishne
- Abstract summary: We propose a versatile graph inference framework for learning from graph signals corrupted by exponential family noise.
Our framework generalizes previous methods from continuous smooth graph signals to various data types.
- Score: 8.594140167290098
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph signal processing (GSP) is a prominent framework for analyzing signals on non-Euclidean domains. The graph Fourier transform (GFT) uses the combinatorial graph Laplacian matrix to reveal the spectral decomposition of signals in the graph frequency domain. However, a common challenge in applying GSP methods is that in many scenarios the underlying graph of a system is unknown. A solution in such cases is to construct the unobserved graph from available data, which is commonly referred to as graph or network inference. Although different graph inference methods exist, these are restricted to learning from either smooth graph signals or simple additive Gaussian noise. Other types of noisy data, such as discrete counts or binary digits, are rather common in real-world applications, yet are underexplored in graph inference. In this paper, we propose a versatile graph inference framework for learning from graph signals corrupted by exponential family noise. Our framework generalizes previous methods from continuous smooth graph signals to various data types. We propose an alternating algorithm that jointly estimates the graph Laplacian and the unobserved smooth representation from the noisy signals. We also extend our approach to a variational form to account for the inherent stochasticity of the latent smooth representation. Finally, since real-world graph signals are frequently non-independent and temporally correlated, we further adapt our original setting to a time-vertex formulation. We demonstrate on synthetic and real-world data that our new algorithms outperform competing Laplacian estimation methods that suffer from noise model mismatch.
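The alternating estimation described in the abstract can be illustrated with a toy sketch. The code below is an illustrative heuristic under assumed choices, not the authors' actual algorithm: for Poisson count data, it alternates gradient steps on a latent log-rate matrix `X` under a Poisson likelihood plus the Laplacian smoothness penalty tr(X^T L X), with a simple Gaussian-kernel re-estimation of edge weights from pairwise distances (a common smoothness-based surrogate for Laplacian learning); the step size, penalty weight `alpha`, and kernel update are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplacian_from_weights(W):
    """Combinatorial graph Laplacian L = D - W."""
    return np.diag(W.sum(axis=1)) - W

def alternating_fit(Y, alpha=0.1, step=0.01, iters=100):
    """Toy alternating scheme for Poisson-corrupted graph signals.

    Y: (n nodes x m signals) count matrix.
    Alternates between
      (1) gradient steps on the latent log-rates X, combining the
          Poisson negative log-likelihood gradient exp(X) - Y with
          the smoothness gradient 2 * alpha * L @ X, and
      (2) refreshing edge weights from pairwise distances between
          rows of X via a Gaussian kernel (a heuristic stand-in for
          a full Laplacian learning step).
    """
    n, _ = Y.shape
    X = np.log(Y + 1.0)                 # initialize latent log-rates
    W = np.ones((n, n)) - np.eye(n)     # start from a complete graph
    for _ in range(iters):
        L = laplacian_from_weights(W)
        grad = (np.exp(X) - Y) + 2.0 * alpha * L @ X
        X -= step * grad
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
        W = np.exp(-d2 / (d2.mean() + 1e-12))
        np.fill_diagonal(W, 0.0)
    return W, X

# demo: two groups of nodes with different Poisson rates
rates = np.r_[np.full(5, 2.0), np.full(5, 10.0)]
Y = rng.poisson(rates[:, None], size=(10, 20)).astype(float)
W, X = alternating_fit(Y)
```

In this sketch, nodes with similar latent rates end up more strongly connected, mirroring the smoothness assumption; the paper's framework replaces the Poisson likelihood with a general exponential family term and the kernel update with a principled Laplacian estimation step.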
Related papers
- Joint Signal Recovery and Graph Learning from Incomplete Time-Series [24.308357458676937]
In this work, we aim to learn a graph from incomplete time-series observations.
We propose an algorithm based on the method of block successive upper-bound minimization.
Simulation results on synthetic and real time-series demonstrate the performance of the proposed method.
arXiv Detail & Related papers (2023-12-28T10:27:04Z) - Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC which considers continuous random Gaussian noise in the graph Fourier domain.
arXiv Detail & Related papers (2023-02-08T08:17:43Z) - Large Graph Signal Denoising with Application to Differential Privacy [2.867517731896504]
We consider the case of signal denoising on graphs via a data-driven wavelet tight frame methodology.
We make it scalable to large graphs using Chebyshev-Jackson approximations.
A comprehensive performance analysis is carried out on graphs of varying size, from real and simulated data.
arXiv Detail & Related papers (2022-09-05T16:32:54Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Deep Graph-level Anomaly Detection by Glocal Knowledge Distillation [61.39364567221311]
Graph-level anomaly detection (GAD) describes the problem of detecting graphs that are abnormal in their structure and/or the features of their nodes.
One of the challenges in GAD is to devise graph representations that enable the detection of both locally- and globally-anomalous graphs.
We introduce a novel deep anomaly detection approach for GAD that learns rich global and local normal pattern information by joint random distillation of graph and node representations.
arXiv Detail & Related papers (2021-12-19T05:04:53Z) - Neighborhood Random Walk Graph Sampling for Regularized Bayesian Graph Convolutional Neural Networks [0.6236890292833384]
In this paper, we propose a novel algorithm called Bayesian Graph Convolutional Network using Neighborhood Random Walk Sampling (BGCN-NRWS).
BGCN-NRWS uses a Markov Chain Monte Carlo (MCMC) based graph sampling algorithm that exploits graph structure, reduces overfitting through a variational inference layer, and yields consistently competitive classification results compared to the state-of-the-art in semi-supervised node classification.
arXiv Detail & Related papers (2021-12-14T20:58:27Z) - A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z) - Learning Graphs from Smooth Signals under Moment Uncertainty [23.868075779606425]
We consider the problem of inferring the graph structure from a given set of graph signals.
Traditional graph learning models do not account for the distributional uncertainty that arises when only the moments of the signal distribution are known.
arXiv Detail & Related papers (2021-05-12T06:47:34Z) - FiGLearn: Filter and Graph Learning using Optimal Transport [49.428169585114496]
We introduce a novel graph signal processing framework for learning the graph and its generating filter from signal observations.
We show how this framework can be used to infer missing values if only very little information is available.
arXiv Detail & Related papers (2020-10-29T10:00:42Z) - Graph Convolution with Low-rank Learnable Local Filters [32.00396411583352]
This paper introduces a new type of graph convolution with learnable low-rank local filters.
It is provably more expressive than previous spectral graph convolution methods.
Its representation power on input graph data is theoretically established, making use of the graph filter locality and the local graph regularization.
arXiv Detail & Related papers (2020-08-04T20:34:59Z) - Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.