FuDGE: A Method to Estimate a Functional Differential Graph in a
High-Dimensional Setting
- URL: http://arxiv.org/abs/2003.05402v4
- Date: Fri, 1 Apr 2022 15:42:07 GMT
- Title: FuDGE: A Method to Estimate a Functional Differential Graph in a
High-Dimensional Setting
- Authors: Boxin Zhao, Y. Samuel Wang, Mladen Kolar
- Abstract summary: We consider the problem of estimating the difference between two undirected functional graphical models with shared structures.
We first define a functional differential graph that captures the differences between two functional graphical models.
We then propose a method, FuDGE, that directly estimates the functional differential graph without first estimating each individual graph.
- Score: 17.104487467949113
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the problem of estimating the difference between two undirected
functional graphical models with shared structures. In many applications, data
are naturally regarded as a vector of random functions rather than as a vector
of scalars. For example, electroencephalography (EEG) data are treated more
appropriately as functions of time. In such a problem, not only can the number
of functions measured per sample be large, but each function is itself an
infinite-dimensional object, making estimation of model parameters challenging.
This is further complicated by the fact that curves are usually observed only
at discrete time points. We first define a functional differential graph that
captures the differences between two functional graphical models and formally
characterize when the functional differential graph is well defined. We then
propose a method, FuDGE, that directly estimates the functional differential
graph without first estimating each individual graph. This is particularly
beneficial in settings where the individual graphs are dense but the
differential graph is sparse. We show that FuDGE consistently estimates the
functional differential graph even in a high-dimensional setting for both fully
observed and discretely observed function paths. We illustrate the finite
sample properties of our method through simulation studies. We also propose a
competing method, the Joint Functional Graphical Lasso, which generalizes the
Joint Graphical Lasso to the functional setting. Finally, we apply our method
to EEG data to uncover differences in functional brain connectivity between a
group of individuals with alcohol use disorder and a control group.
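The abstract does not spell out the estimation procedure, so the following is only a minimal, hypothetical sketch of the direct-estimation idea it describes: reduce each observed curve to a small number of basis scores (a simple PCA stand-in for a full functional principal component analysis), form the score-level covariance matrices for the two groups, and estimate the block-sparse difference of their precision matrices with a group-lasso-penalized D-trace-style loss solved by proximal gradient. All names and choices below (`basis_scores`, `dtrace_group_lasso`, `differential_edges`, the step size, the penalty level `lam`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def basis_scores(curves, M):
    """Reduce each curve to M basis scores by projecting onto the top
    principal directions of the pooled, discretized curves.
    curves: array of shape (n_samples, p_nodes, T_grid_points)."""
    n, p, T = curves.shape
    flat = curves.reshape(n * p, T)
    flat = flat - flat.mean(axis=0, keepdims=True)
    _, _, vt = np.linalg.svd(flat, full_matrices=False)
    basis = vt[:M]                       # (M, T) principal directions
    return curves @ basis.T              # (n, p, M) score array


def dtrace_group_lasso(S_x, S_y, p, M, lam, n_iter=500):
    """Proximal gradient for the penalized D-trace criterion
        0.5 * tr(S_x D S_y D) - tr(D (S_y - S_x)) + lam * sum_{j,k} ||D_jk||_F,
    whose unpenalized population minimizer is Theta_X - Theta_Y, the
    difference of the score-level precision matrices."""
    d = p * M
    D = np.zeros((d, d))
    # conservative step size from the curvature of the quadratic term
    step = 1.0 / (np.linalg.norm(S_x, 2) * np.linalg.norm(S_y, 2) + 1e-12)
    for _ in range(n_iter):
        grad = 0.5 * (S_x @ D @ S_y + S_y @ D @ S_x) - (S_y - S_x)
        D = D - step * grad
        # blockwise soft-thresholding: one M x M block per node pair
        for j in range(p):
            for k in range(p):
                blk = D[j * M:(j + 1) * M, k * M:(k + 1) * M]
                shrink = max(0.0, 1.0 - step * lam / (np.linalg.norm(blk) + 1e-12))
                D[j * M:(j + 1) * M, k * M:(k + 1) * M] = shrink * blk
        D = 0.5 * (D + D.T)              # keep the iterate symmetric
    return D


def differential_edges(D, p, M, tol=1e-3):
    """Node pairs whose block exceeds tol in Frobenius norm are reported
    as edges of the estimated differential graph."""
    return [(j, k) for j in range(p) for k in range(j + 1, p)
            if np.linalg.norm(D[j * M:(j + 1) * M, k * M:(k + 1) * M]) > tol]


# Toy usage on synthetic data: n samples of p curves observed on T grid points.
rng = np.random.default_rng(0)
n, p, T, M = 200, 6, 50, 3
X = rng.standard_normal((n, p, T)).cumsum(axis=2)   # crude stand-in for smooth curves
Y = rng.standard_normal((n, p, T)).cumsum(axis=2)
S_x = np.cov(basis_scores(X, M).reshape(n, p * M), rowvar=False)
S_y = np.cov(basis_scores(Y, M).reshape(n, p * M), rowvar=False)
Delta_hat = dtrace_group_lasso(S_x, S_y, p, M, lam=0.1)
print(differential_edges(Delta_hat, p, M))
```

The blockwise penalty reflects the idea that a differential edge between two nodes is an all-or-nothing group of score interactions. The paper's actual choice of basis, loss, tuning, and its treatment of discretely observed curves are more involved; this sketch is only meant to make the "estimate the difference directly, without estimating either graph" idea concrete.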
Related papers
- Learning High-Dimensional Differential Graphs From Multi-Attribute Data [12.94486861344922]
We consider the problem of estimating the difference between two Gaussian graphical models (GGMs) that are known to have similar structure.
Existing methods for differential graph estimation are based on single-attribute (SA) models.
In this paper, we analyze a group-lasso-penalized D-trace loss approach for differential graph learning from multi-attribute data (a generic form of this loss is sketched after the related-papers list).
arXiv Detail & Related papers (2023-12-05T18:54:46Z)
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, called soft manifolds, that can address this issue.
Using soft manifolds for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- Graph Fourier MMD for Signals on Graphs [67.68356461123219]
We propose a novel distance between distributions and signals on graphs.
Graph Fourier MMD (GFMMD) is defined via an optimal witness function that is both smooth on the graph and maximizes the difference in expectation.
We showcase it on graph benchmark datasets as well as on single-cell RNA-sequencing data analysis.
arXiv Detail & Related papers (2023-06-05T00:01:17Z)
- GIF: A General Graph Unlearning Strategy via Influence Function [63.52038638220563]
Graph Influence Function (GIF) is a model-agnostic unlearning method that can efficiently and accurately estimate parameter changes in response to an $\epsilon$-mass perturbation of the deleted data.
We conduct extensive experiments on four representative GNN models and three benchmark datasets to justify GIF's superiority in terms of unlearning efficacy, model utility, and unlearning efficiency.
arXiv Detail & Related papers (2023-04-06T03:02:54Z)
- Low-Rank Covariance Completion for Graph Quilting with Applications to Functional Connectivity [8.500141848121782]
In many calcium imaging data sets, the full population of neurons is not recorded simultaneously, but instead in partially overlapping blocks.
In this paper, we introduce a novel two-step approach to Graph Quilting, which first imputes the full covariance matrix using low-rank covariance completion.
We show the efficacy of these methods for estimating connectivity from calcium imaging data.
arXiv Detail & Related papers (2022-09-17T08:03:46Z)
- Learning Graphs from Smooth Signals under Moment Uncertainty [23.868075779606425]
We consider the problem of inferring the graph structure from a given set of graph signals.
Traditional graph learning models do not take uncertainty in the distribution of the observed signals into account.
arXiv Detail & Related papers (2021-05-12T06:47:34Z)
- Deep Reinforcement Learning of Graph Matching [63.469961545293756]
Graph matching (GM) under node and pairwise constraints has been a building block in areas from optimization to computer vision.
We present a reinforcement learning solver for GM, named RGM, that seeks the node correspondence between a pair of graphs.
Our method differs from previous deep graph matching models, which focus on front-end feature extraction and affinity function learning.
arXiv Detail & Related papers (2020-12-16T13:48:48Z)
- Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method leads to a representative graph with clear cluster structure, which can then be used to solve clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
- Characteristic Functions on Graphs: Birds of a Feather, from Statistical Descriptors to Parametric Models [8.147652597876862]
We introduce FEATHER, a computationally efficient algorithm to calculate a specific variant of characteristic functions.
We argue that features extracted by FEATHER are useful for node level machine learning tasks.
Experiments on real world large datasets show that our proposed algorithm creates high quality representations.
arXiv Detail & Related papers (2020-05-16T11:47:05Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
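As a point of reference for the differential-graph entries above (in particular the multi-attribute D-trace entry), a generic group-lasso-penalized D-trace criterion is sketched below. This is a hedged, textbook-style rendering; the exact weighting, symmetrization, and penalty grouping in the cited papers may differ.

$$
\hat{\Delta} \in \arg\min_{\Delta}\; \tfrac{1}{2}\,\operatorname{tr}\!\big(\hat{\Sigma}_X \,\Delta\, \hat{\Sigma}_Y \,\Delta\big) \;-\; \operatorname{tr}\!\big(\Delta\,(\hat{\Sigma}_Y - \hat{\Sigma}_X)\big) \;+\; \lambda \sum_{j,k} \big\lVert \Delta_{jk} \big\rVert_F
$$

Here $\hat{\Sigma}_X$ and $\hat{\Sigma}_Y$ are sample covariance matrices of the two groups (of multi-attribute vectors or basis scores), $\Delta_{jk}$ is the block collecting all attributes of node pair $(j,k)$, and the population minimizer of the unpenalized loss is $\Sigma_X^{-1} - \Sigma_Y^{-1}$, so the difference of the precision matrices is estimated without estimating either one individually.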