Robust Graph Learning Under Wasserstein Uncertainty
- URL: http://arxiv.org/abs/2105.04210v2
- Date: Thu, 13 May 2021 01:12:32 GMT
- Title: Robust Graph Learning Under Wasserstein Uncertainty
- Authors: Xiang Zhang, Yinfei Xu, Qinghe Liu, Zhicheng Liu, Jian Lu and Qiao
Wang
- Abstract summary: In many scenarios, an accurate graph structure representing signals is not available at all.
We propose a graph learning framework using Wasserstein distributionally robust optimization (WDRO).
We show that our scheme can learn a reliable graph in the context of uncertainty.
- Score: 35.85333465732067
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs are playing a crucial role in different fields since they are powerful
tools to unveil intrinsic relationships among signals. In many scenarios, an
accurate graph structure representing the signals is not available at all,
which motivates learning a reliable graph structure directly from the observed
signals. However, in practice, uncertainty in the observed signals is
inevitable due to noisy measurements or limited observability, which reduces
the reliability of the learned graph. To this end, we
propose a graph learning framework using Wasserstein distributionally robust
optimization (WDRO) which handles uncertainty in data by defining an
uncertainty set on distributions of the observed data. Specifically, two
models are developed: one assumes that all distributions in the uncertainty
set are Gaussian, while the other makes no prior distributional assumption.
Instead of applying the interior-point method directly, we propose two
algorithms to solve the corresponding models and show that they are more
time-efficient. In addition, we reformulate both models as Semi-Definite
Programs (SDP) and illustrate that the SDP formulations are intractable for
large-scale graphs. Experiments on both synthetic and real-world data validate
the proposed framework, showing that our scheme can learn a reliable graph
under uncertainty.
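The abstract does not spell out the optimization, but the underlying idea of learning edge weights so that the observed signals vary smoothly over the graph, while guarding against perturbed data, can be sketched in a few lines of NumPy. This is a hedged toy illustration, not the paper's WDRO algorithm: the `learn_graph` function, the log-degree barrier, and the `eps` inflation of pairwise distances (a crude stand-in for the Wasserstein uncertainty radius) are all assumptions of this sketch.

```python
import numpy as np

def learn_graph(X, alpha=0.1, beta=0.5, eps=0.0, n_iter=500, lr=0.01):
    """Learn non-negative edge weights from signals X (n_nodes x n_signals).

    Minimizes  sum_ij W_ij * (Z_ij + eps) - beta * sum_i log(d_i) + alpha * ||W||^2
    by projected gradient descent, where Z_ij = ||x_i - x_j||^2 is the signal
    smoothness term, d_i = sum_j W_ij is the node degree, and eps inflates the
    pairwise distances as a crude surrogate for distributional uncertainty
    (NOT the paper's WDRO formulation).
    """
    n = X.shape[0]
    # pairwise squared distances between node signal vectors
    Z = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.ones((n, n)) - np.eye(n)  # init: fully connected, unit weights
    for _ in range(n_iter):
        d = W.sum(1)  # node degrees
        # gradient of the objective w.r.t. the symmetric weight matrix
        grad = (Z + eps) - beta * (1.0 / d[:, None] + 1.0 / d[None, :]) + 2 * alpha * W
        W = np.maximum(W - lr * grad, 0.0)  # projected step: weights stay >= 0
        np.fill_diagonal(W, 0.0)
    return (W + W.T) / 2  # symmetrize
```

On signals that are similar on some node pairs and dissimilar on others, the smoothness term drives dissimilar pairs to zero weight while the log-degree barrier keeps every node connected; increasing `eps` uniformly penalizes all edges, mimicking a more conservative (robust) solution.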
Related papers
- Learning Latent Graph Structures and their Uncertainty [63.95971478893842]
Graph Neural Networks (GNNs) use relational information as an inductive bias to enhance the model's accuracy.
As task-relevant relations might be unknown, graph structure learning approaches have been proposed to learn them while solving the downstream prediction task.
arXiv Detail & Related papers (2024-05-30T10:49:22Z) - Handling Distribution Shifts on Graphs: An Invariance Perspective [78.31180235269035]
We formulate the OOD problem on graphs and develop a new invariant learning approach, Explore-to-Extrapolate Risk Minimization (EERM).
EERM resorts to multiple context explorers that are adversarially trained to maximize the variance of risks from multiple virtual environments.
We prove the validity of our method by theoretically showing its guarantee of a valid OOD solution.
arXiv Detail & Related papers (2022-02-05T02:31:01Z) - Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z) - Distributionally Robust Semi-Supervised Learning Over Graphs [68.29280230284712]
Semi-supervised learning (SSL) over graph-structured data emerges in many network science applications.
To efficiently manage learning over graphs, variants of graph neural networks (GNNs) have been developed recently.
Despite their success in practice, most of existing methods are unable to handle graphs with uncertain nodal attributes.
Challenges also arise due to distributional uncertainties associated with data acquired by noisy measurements.
A distributionally robust learning framework is developed, where the objective is to train models that exhibit quantifiable robustness against perturbations.
arXiv Detail & Related papers (2021-10-20T14:23:54Z) - Learning Graphs from Smooth Signals under Moment Uncertainty [23.868075779606425]
We consider the problem of inferring the graph structure from a given set of graph signals.
Traditional graph learning models do not take this distributional uncertainty into account.
arXiv Detail & Related papers (2021-05-12T06:47:34Z) - Graph Embedding with Data Uncertainty [113.39838145450007]
Spectral-based subspace learning is a common data preprocessing step in many machine learning pipelines.
Most subspace learning methods do not take into consideration possible measurement inaccuracies or artifacts that can lead to data with high uncertainty.
arXiv Detail & Related papers (2020-09-01T15:08:23Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.