Optimizing the Induced Correlation in Omnibus Joint Graph Embeddings
- URL: http://arxiv.org/abs/2409.17544v2
- Date: Mon, 30 Sep 2024 23:31:28 GMT
- Title: Optimizing the Induced Correlation in Omnibus Joint Graph Embeddings
- Authors: Konstantinos Pantazis, Michael Trosset, William N. Frost, Carey E. Priebe, Vince Lyzinski
- Abstract summary: We study the correlation-to-OMNI problem and the flat correlation problem.
In the correlation-to-OMNI problem, we present an algorithm that estimates the matrix of generalized Omnibus weights that induces optimal correlation in the embedding space.
- Score: 14.008892752201593
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Theoretical and empirical evidence suggests that joint graph embedding algorithms induce correlation across the networks in the embedding space. In the Omnibus joint graph embedding framework, previous results explicitly delineated the dual effects of the algorithm-induced and model-inherent correlations on the correlation across the embedded networks. Accounting for and mitigating the algorithm-induced correlation is key to subsequent inference, as sub-optimal Omnibus matrix constructions have been demonstrated to lead to loss in inference fidelity. This work presents the first efforts to automate the Omnibus construction in order to address two key questions in this joint embedding framework: the correlation-to-OMNI problem and the flat correlation problem. In the flat correlation problem, we seek to understand the minimum algorithm-induced flat correlation (i.e., the same across all graph pairs) produced by a generalized Omnibus embedding. Working in a subspace of the fully general Omnibus matrices, we prove both a lower bound for this flat correlation and that the classical Omnibus construction induces the maximal flat correlation. In the correlation-to-OMNI problem, we present an algorithm -- named corr2Omni -- that, from a given matrix of estimated pairwise graph correlations, estimates the matrix of generalized Omnibus weights that induces optimal correlation in the embedding space. Moreover, in both simulated and real data settings, we demonstrate the increased effectiveness of our corr2Omni algorithm versus the classical Omnibus construction.
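Since the abstract turns on how the Omnibus matrix is assembled, a minimal sketch may help fix ideas. The block below builds the classical Omnibus matrix (diagonal blocks A_i, off-diagonal blocks (A_i + A_j)/2) and a simple weighted variant, then embeds it via adjacency spectral embedding. The weighting scheme w*A_i + (1-w)*A_j is only an illustrative stand-in for the paper's generalized Omnibus weights, and the corr2Omni estimation step itself is not implemented; the function names and the toy Erdős-Rényi example are assumptions for illustration.
```python
# Hedged sketch of Omnibus-style joint graph embedding.
# The classical construction places A_i on the diagonal blocks and
# (A_i + A_j)/2 on the off-diagonal blocks; the weighted variant below is a
# simplified stand-in for the generalized Omnibus weights discussed in the
# paper (corr2Omni, which estimates those weights from a target correlation
# matrix, is NOT implemented here).
import numpy as np


def omnibus_matrix(adjacencies, weights=None):
    """Stack m symmetric n-by-n adjacency matrices into an mn-by-mn Omnibus matrix.

    Block (i, j) is weights[i, j] * A_i + (1 - weights[i, j]) * A_j; the
    classical OMNI construction corresponds to all weights equal to 1/2.
    To keep the Omnibus matrix symmetric, weights must satisfy
    weights[j, i] == 1 - weights[i, j].
    """
    m, n = len(adjacencies), adjacencies[0].shape[0]
    if weights is None:
        weights = np.full((m, m), 0.5)  # classical OMNI: pairwise averages
    M = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(m):
            w = weights[i, j]
            M[i*n:(i+1)*n, j*n:(j+1)*n] = w * adjacencies[i] + (1 - w) * adjacencies[j]
    return M


def omnibus_embedding(adjacencies, d, weights=None):
    """d-dimensional adjacency spectral embedding of the Omnibus matrix,
    returned as one n-by-d point cloud per input graph."""
    M = omnibus_matrix(adjacencies, weights)
    evals, evecs = np.linalg.eigh(M)
    top = np.argsort(np.abs(evals))[::-1][:d]        # top-d eigenpairs by magnitude
    X = evecs[:, top] * np.sqrt(np.abs(evals[top]))  # scale eigenvectors
    n = adjacencies[0].shape[0]
    return [X[i*n:(i+1)*n] for i in range(len(adjacencies))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 60, 0.3
    graphs = []
    for _ in range(3):  # three independent ER(n, p) graphs from a common model
        upper = np.triu((rng.random((n, n)) < p).astype(float), 1)
        graphs.append(upper + upper.T)
    Xs = omnibus_embedding(graphs, d=1)
    # Correlation induced across a pair of embedded graphs (first dimension).
    print(np.corrcoef(Xs[0][:, 0], Xs[1][:, 0])[0, 1])
```
In this toy run one would typically observe a nonzero correlation between the embedded point clouds even though the graphs were generated independently; this is the algorithm-induced correlation the paper seeks to quantify and control by choosing the off-diagonal weights.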
Related papers
- Fair Canonical Correlation Analysis [14.206538828733507]
Canonical Correlation Analysis (CCA) is a widely used technique for examining the relationship between two sets of variables.
We present a framework that alleviates unfairness by minimizing the correlation disparity error associated with protected attributes.
arXiv Detail & Related papers (2023-09-27T17:34:13Z)
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose a Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Efficient Algorithms for Exact Graph Matching on Correlated Stochastic Block Models with Constant Correlation [7.914348940034351]
We propose an efficient algorithm for matching graphs with community structure.
Our algorithm is the first low-order polynomial-time algorithm achieving exact matching between two correlated block models.
arXiv Detail & Related papers (2023-05-31T09:06:50Z)
- Causally-guided Regularization of Graph Attention Improves Generalizability [69.09877209676266]
We introduce CAR, a general-purpose regularization framework for graph attention networks.
CAR aligns the attention mechanism with the causal effects of active interventions on graph connectivity.
For social media network-sized graphs, a CAR-guided graph rewiring approach could allow us to combine the scalability of graph convolutional methods with the higher performance of graph attention.
arXiv Detail & Related papers (2022-10-20T01:29:10Z)
- Multi-task Learning of Order-Consistent Causal Graphs [59.9575145128345]
We consider the problem of discovering $K$ related Gaussian directed acyclic graphs (DAGs).
In a multi-task learning setting, we propose an $l_1/l_2$-regularized maximum likelihood estimator (MLE) for learning $K$ linear structural equation models.
We theoretically show that the joint estimator, by leveraging data across related tasks, can achieve a better sample complexity for recovering the causal order.
arXiv Detail & Related papers (2021-11-03T22:10:18Z)
- Counterfactual Maximum Likelihood Estimation for Training Deep Networks [83.44219640437657]
Deep learning models are prone to learning spurious correlations that should not be learned as predictive clues.
We propose a causality-based training framework to reduce the spurious correlations caused by observable confounders.
We conduct experiments on two real-world tasks: Natural Language Inference (NLI) and Image Captioning.
arXiv Detail & Related papers (2021-06-07T17:47:16Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- A Universal Model for Cross Modality Mapping by Relational Reasoning [29.081989993636338]
Cross modality mapping has attracted growing attention in the computer vision community.
We propose a GCN-based Reasoning Network (RR-Net) in which inter- and intra-relations are efficiently computed.
Experiments on three example tasks, i.e., image classification, social recommendation and sound recognition, clearly demonstrate the superiority and universality of our proposed model.
arXiv Detail & Related papers (2021-02-26T08:56:24Z)
- RatE: Relation-Adaptive Translating Embedding for Knowledge Graph Completion [51.64061146389754]
We propose a relation-adaptive translation function built upon a novel weighted product in complex space.
We then present our Relation-adaptive translating Embedding (RatE) approach to score each graph triple.
arXiv Detail & Related papers (2020-10-10T01:30:30Z)
- Exploiting non-i.i.d. data towards more robust machine learning algorithms [0.0]
Machine learning algorithms have increasingly been shown to excel in finding patterns and correlations from data.
In this paper a regularization scheme is introduced that prefers universal causal correlations.
Better performance is obtained on an out-of-distribution test set compared with a more conventional $l_2$-regularization.
arXiv Detail & Related papers (2020-10-07T14:15:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.