uGLAD: Sparse graph recovery by optimizing deep unrolled networks
- URL: http://arxiv.org/abs/2205.11610v1
- Date: Mon, 23 May 2022 20:20:27 GMT
- Title: uGLAD: Sparse graph recovery by optimizing deep unrolled networks
- Authors: Harsh Shrivastava, Urszula Chajewska, Robin Abraham, Xinshi Chen
- Abstract summary: We present a novel technique to perform sparse graph recovery by optimizing deep unrolled networks.
Our model, uGLAD, builds upon and extends the state-of-the-art model GLAD to the unsupervised setting.
We evaluate model results on synthetic Gaussian data, non-Gaussian data generated from Gene Regulatory Networks, and present a case study in anaerobic digestion.
- Score: 11.48281545083889
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Probabilistic Graphical Models (PGMs) are generative models of complex
systems. They rely on conditional independence assumptions between variables to
learn sparse representations which can be visualized in the form of a graph. Such
models are used for domain exploration and structure discovery in poorly
understood domains. This work introduces a novel technique to perform sparse
graph recovery by optimizing deep unrolled networks. Assuming that the input
data $X\in\mathbb{R}^{M\times D}$ comes from an underlying multivariate
Gaussian distribution, we apply a deep model on $X$ that outputs the precision
matrix $\Theta$, which can also be interpreted as the adjacency matrix. Our
model, uGLAD, builds upon and extends the state-of-the-art model GLAD to the
unsupervised setting. The key benefits of our model are (1) uGLAD automatically
optimizes sparsity-related regularization parameters, leading to better
performance than existing algorithms. (2) We introduce a multi-task-learning-based
`consensus' strategy for robust handling of missing data in an
unsupervised setting. We evaluate model results on synthetic Gaussian data,
non-Gaussian data generated from Gene Regulatory Networks, and present a case
study in anaerobic digestion.
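To make the optimization target concrete, the following is a minimal sketch, not the authors' released code: it assumes the unsupervised loss is the Gaussian negative log-likelihood $\mathrm{tr}(S\Theta) - \log\det\Theta$ (the graphical-lasso data term), with a plain SPD parameterization standing in for the unrolled GLAD architecture; data sizes, learning rate, and the thresholding cutoff are placeholders.

```python
# Sketch: minimize the unsupervised Gaussian NLL over the precision matrix.
import torch

def sample_covariance(X: torch.Tensor) -> torch.Tensor:
    # S = (1 / (M - 1)) * Xc^T Xc for centered data Xc.
    Xc = X - X.mean(dim=0, keepdim=True)
    return Xc.T @ Xc / (X.shape[0] - 1)

def gaussian_nll(theta: torch.Tensor, S: torch.Tensor) -> torch.Tensor:
    # tr(S Theta) - log det(Theta), up to constants.
    return torch.trace(S @ theta) - torch.linalg.slogdet(theta).logabsdet

M, D = 500, 10
X = torch.randn(M, D)                       # placeholder data
S = sample_covariance(X)

A = torch.nn.Parameter(torch.eye(D))        # Theta = A A^T + eps*I stays SPD
opt = torch.optim.Adam([A], lr=1e-2)
for _ in range(500):
    theta = A @ A.T + 1e-4 * torch.eye(D)
    loss = gaussian_nll(theta, S)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Read off an adjacency matrix by thresholding |Theta| (cutoff is arbitrary).
adjacency = (theta.detach().abs() > 1e-2).float()
```

The actual uGLAD model replaces this direct parameterization with GLAD's unrolled iterations, which is what allows the sparsity-related regularization to be adapted automatically rather than tuned by hand.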
Related papers
- Learning High-Dimensional Differential Graphs From Multi-Attribute Data [12.94486861344922]
We consider the problem of estimating differences in two Gaussian graphical models (GGMs) which are known to have similar structure.
Existing methods for differential graph estimation are based on single-attribute (SA) models.
In this paper, we analyze a group lasso penalized D-trace loss function approach for differential graph learning from multi-attribute data.
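For orientation, a hedged sketch of a lasso-penalized D-trace objective for the differential precision matrix (sign and scaling conventions vary across papers; the paper analyzes a group-lasso, multi-attribute generalization of this single-attribute form):
$$
\hat{\Delta} = \arg\min_{\Delta}\; \tfrac{1}{2}\operatorname{tr}\!\big(\Delta S_x \Delta S_y\big) - \operatorname{tr}\!\big(\Delta (S_x - S_y)\big) + \lambda \sum_{i\neq j}\lvert \Delta_{ij}\rvert,
$$
where $S_x$ and $S_y$ are the sample covariances of the two conditions and $\Delta$ estimates the difference of their precision matrices.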
arXiv Detail & Related papers (2023-12-05T18:54:46Z)
- Latent Graph Inference using Product Manifolds [0.0]
We generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
Our novel approach is tested on a wide range of datasets, and outperforms the original dDGM model.
arXiv Detail & Related papers (2022-11-26T22:13:06Z)
- Graph Polynomial Convolution Models for Node Classification of Non-Homophilous Graphs [52.52570805621925]
We investigate efficient learning from higher-order graph convolution and learning directly from the adjacency matrix for node classification.
We show that the resulting model leads to new graphs and a residual scaling parameter.
We demonstrate that the proposed methods obtain improved accuracy for node classification of non-homophilous graphs.
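As a point of reference, one generic form of an order-$K$ polynomial graph convolution built directly on the adjacency matrix $A$ is
$$
Z = \sum_{k=0}^{K} \theta_k\, A^{k} X W,
$$
with learnable coefficients $\theta_k$ and feature transform $W$; the paper's exact parameterization, including its residual scaling parameter, differs, so this display is only an assumed illustration.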
arXiv Detail & Related papers (2022-09-12T04:46:55Z)
- Scalable Deep Gaussian Markov Random Fields for General Graphs [14.653008985229615]
We propose a flexible GMRF model for general graphs built on the multi-layer structure of Deep GMRFs.
For a Gaussian likelihood, close-to-exact Bayesian inference is available for the latent field.
The usefulness of the proposed model is verified by experiments on a number of synthetic and real world datasets.
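The close-to-exact inference claim rests on Gaussian conjugacy; as an illustrative assumption about the observation model, with a GMRF prior $x \sim \mathcal{N}(\mu, Q^{-1})$ and Gaussian likelihood $y \mid x \sim \mathcal{N}(x, \sigma^2 I)$, the posterior over the latent field is again Gaussian,
$$
x \mid y \sim \mathcal{N}\big(\tilde{\mu}, \tilde{Q}^{-1}\big), \qquad \tilde{Q} = Q + \sigma^{-2} I, \qquad \tilde{Q}\,\tilde{\mu} = Q\mu + \sigma^{-2} y,
$$
so inference reduces to solving a sparse linear system, e.g. with conjugate gradients.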
arXiv Detail & Related papers (2022-06-10T12:12:41Z)
- Discovering Invariant Rationales for Graph Neural Networks [104.61908788639052]
Intrinsic interpretability of graph neural networks (GNNs) means finding a small subset of the input graph's features that guides the model's prediction.
We propose a new strategy of discovering invariant rationale (DIR) to construct intrinsically interpretable GNNs.
arXiv Detail & Related papers (2022-01-30T16:43:40Z)
- The Deep Generative Decoder: MAP estimation of representations improves modeling of single-cell RNA data [0.0]
We present a simple generative model that computes model parameters and representations directly via maximum a posteriori (MAP) estimation.
The advantages of this approach are its simplicity and its capability to provide representations of much smaller dimensionality than a comparable VAE.
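A generic sketch of the MAP objective this describes (notation assumed, not taken from the paper): decoder parameters $\theta$ and per-sample representations $z_n$ are fit jointly,
$$
(\hat{\theta}, \hat{z}_{1:N}) = \arg\max_{\theta,\, z_{1:N}} \sum_{n=1}^{N} \Big[ \log p_\theta(x_n \mid z_n) + \log p(z_n) \Big],
$$
so no amortized encoder network is needed, in line with the simplicity claimed above.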
arXiv Detail & Related papers (2021-10-13T12:17:46Z)
- Cauchy-Schwarz Regularized Autoencoder [68.80569889599434]
Variational autoencoders (VAE) are a powerful and widely-used class of generative models.
We introduce a new constrained objective based on the Cauchy-Schwarz divergence, which can be computed analytically for GMMs.
Our objective improves upon variational auto-encoding models in density estimation, unsupervised clustering, semi-supervised learning, and face analysis.
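For reference, the underlying Cauchy-Schwarz divergence between densities $p$ and $q$ is
$$
D_{\mathrm{CS}}(p, q) = -\log \frac{\int p(x)\, q(x)\, dx}{\sqrt{\int p(x)^2\, dx \;\int q(x)^2\, dx}},
$$
which is non-negative, equals zero iff $p = q$, and, unlike the KL term in the standard ELBO, can be evaluated in closed form when $p$ and $q$ are Gaussian mixtures; how the paper embeds it in the autoencoder objective is not reproduced here.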
arXiv Detail & Related papers (2021-01-06T17:36:26Z)
- Goal-directed Generation of Discrete Structures with Conditional Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short python expressions which evaluate to a given target value.
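The expected-reward objective referred to above has the standard form, with a score-function (REINFORCE-style) gradient; the paper's particular estimator and variance-reduction choices are not reproduced here:
$$
J(\theta) = \mathbb{E}_{x \sim p_\theta}\big[R(x)\big], \qquad \nabla_\theta J(\theta) = \mathbb{E}_{x \sim p_\theta}\big[R(x)\, \nabla_\theta \log p_\theta(x)\big].
$$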
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
- Bipartite Link Prediction based on Topological Features via 2-hop Path [0.8223798883838329]
The Linear Graph Autoencoder (LGAE) has shown promising performance on challenging tasks such as link prediction and node clustering.
In this paper, we consider the case of bipartite link predictions where node attributes are unavailable.
Our approach consistently outperforms the Graph Autoencoder and Linear Graph Autoencoder models in 10 out of 12 bipartite datasets and reaches competitive performance in the 2 other bipartite datasets.
arXiv Detail & Related papers (2020-03-19T05:07:54Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
- Revisiting Graph based Collaborative Filtering: A Linear Residual Graph Convolutional Network Approach [55.44107800525776]
Graph Convolutional Networks (GCNs) are state-of-the-art graph based representation learning models.
In this paper, we revisit GCN-based Collaborative Filtering (CF) for Recommender Systems (RS).
We show that removing non-linearities enhances recommendation performance, consistent with the theory of simple graph convolutional networks.
We propose a residual network structure that is specifically designed for CF with user-item interaction modeling.
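As a rough illustration of linear (non-linearity-free) graph convolution with a residual-style combination of propagation depths, here is a minimal sketch; it is an assumption for exposition, not the paper's exact model, and all sizes and names are placeholders.

```python
# Sketch: linear graph propagation without activations, combining all depths.
import torch

def normalize_adj(A: torch.Tensor) -> torch.Tensor:
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2} with self-loops.
    A_hat = A + torch.eye(A.shape[0])
    d_inv_sqrt = A_hat.sum(dim=1).pow(-0.5)
    return d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]

def linear_propagate(A_norm: torch.Tensor, E0: torch.Tensor, K: int = 3) -> torch.Tensor:
    # Propagate embeddings with no non-linear activation, then average depths.
    layers, E = [E0], E0
    for _ in range(K):
        E = A_norm @ E
        layers.append(E)
    return torch.stack(layers).mean(dim=0)

# Toy usage: a random symmetric adjacency standing in for a user-item graph.
N, d = 8, 16
A = (torch.rand(N, N) > 0.7).float()
A = ((A + A.T) > 0).float()
A.fill_diagonal_(0)
E0 = torch.randn(N, d)
final_emb = linear_propagate(normalize_adj(A), E0)
scores = final_emb @ final_emb.T  # preference scores via dot products
```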
arXiv Detail & Related papers (2020-01-28T04:41:25Z)