Collaborative likelihood-ratio estimation over graphs
- URL: http://arxiv.org/abs/2205.14461v2
- Date: Wed, 31 Jan 2024 19:27:55 GMT
- Title: Collaborative likelihood-ratio estimation over graphs
- Authors: Alejandro de la Concha and Nicolas Vayatis and Argyris Kalogeratos
- Abstract summary: We develop this idea into a concrete non-parametric method called Graph-based Relative Unconstrained Least-squares Importance Fitting (GRULSIF). We derive convergence rates for the collaborative approach that highlight the role played by the number of available observations per node, the size of the graph, and how accurately the graph structure encodes the similarity between tasks.
- Score: 55.98760097296213
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Assuming we have iid observations from two unknown probability density
functions (pdfs), $p$ and $q$, the likelihood-ratio estimation (LRE) is an
elegant approach to compare the two pdfs only by relying on the available data.
In this paper, we introduce the first, to the best of our knowledge, graph-based
extension of this problem, which reads as follows: Suppose each node $v$ of a
fixed graph has access to observations coming from two unknown node-specific
pdfs, $p_v$ and $q_v$, and the goal is to estimate for each node the
likelihood-ratio between both pdfs by also taking into account the information
provided by the graph structure. The node-level estimation tasks are supposed
to exhibit similarities conveyed by the graph, which suggests that the nodes
could collaborate to solve them more efficiently. We develop this idea in a
concrete non-parametric method that we call Graph-based Relative Unconstrained
Least-squares Importance Fitting (GRULSIF). We derive convergence rates for our
collaborative approach that highlight the role played by variables such as the
number of available observations per node, the size of the graph, and how
accurately the graph structure encodes the similarity between tasks. These
theoretical results make explicit the situations where collaborative estimation
effectively leads to an improvement in performance compared to solving each
problem independently. Finally, in a series of experiments, we illustrate how
GRULSIF infers the likelihood-ratios at the nodes of the graph more accurately
compared to state-of-the-art LRE methods, which would operate independently at
each node, and we also verify that the behavior of GRULSIF is aligned with our
previous theoretical analysis.
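To make the collaborative setup concrete, the sketch below illustrates one plausible reading of a graph-regularized, RULSIF-style estimator: each node fits a kernel model of its relative likelihood-ratio, and a graph-Laplacian penalty couples the per-node parameters so that neighboring nodes pool information. This is a minimal illustration under stated assumptions, not the authors' GRULSIF implementation; the Gaussian basis, the function and parameter names, and all hyperparameter values are choices made here for demonstration only.

```python
import numpy as np

def gaussian_kernel(X, centers, sigma):
    """Evaluate Gaussian (RBF) basis functions at each row of X."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def collaborative_rulsif(Xp, Xq, L, alpha=0.1, lam=1.0, gamma=0.1,
                         sigma=1.0, n_basis=20, seed=0):
    """Jointly fit one relative likelihood-ratio model per graph node.

    Xp, Xq : lists of arrays, Xp[v] drawn from p_v and Xq[v] from q_v (shape (n_v, d)).
    L      : (N, N) graph Laplacian encoding similarity between node-level tasks.
    Returns per-node weights theta[v] such that theta[v] @ k(x) approximates the
    relative ratio p_v(x) / (alpha * p_v(x) + (1 - alpha) * q_v(x)).
    """
    rng = np.random.default_rng(seed)
    N = len(Xp)
    # Shared kernel centers drawn from the pooled numerator samples.
    pool = np.vstack(Xp)
    centers = pool[rng.choice(len(pool), size=min(n_basis, len(pool)), replace=False)]
    B = len(centers)

    # Per-node quadratic terms of a relative least-squares importance-fitting loss.
    H = np.zeros((N, B, B))
    h = np.zeros((N, B))
    for v in range(N):
        Kp = gaussian_kernel(Xp[v], centers, sigma)   # (n_p, B)
        Kq = gaussian_kernel(Xq[v], centers, sigma)   # (n_q, B)
        H[v] = alpha * (Kp.T @ Kp) / len(Kp) + (1 - alpha) * (Kq.T @ Kq) / len(Kq)
        h[v] = Kp.mean(axis=0)

    # Stationarity conditions of the joint objective:
    #   (H_v + gamma*I) theta_v + lam * sum_u L[v, u] * theta_u = h_v   for every node v.
    # Assemble them as one (N*B) x (N*B) linear system and solve directly.
    A = np.kron(lam * L, np.eye(B))
    for v in range(N):
        A[v * B:(v + 1) * B, v * B:(v + 1) * B] += H[v] + gamma * np.eye(B)
    theta = np.linalg.solve(A, h.ravel()).reshape(N, B)

    def ratio(v, X):
        """Evaluate the estimated relative likelihood-ratio of node v at points X."""
        return gaussian_kernel(np.atleast_2d(X), centers, sigma) @ theta[v]
    return theta, ratio
```

On a toy graph where neighboring nodes undergo similar distribution shifts, increasing `lam` pulls the per-node ratio models toward each other, which is the collaboration effect described in the abstract; setting `lam = 0` recovers independent node-wise estimation.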
Related papers
- Robust Graph Matching Using An Unbalanced Hierarchical Optimal Transport Framework [30.05543844763625]
We propose a novel and robust graph matching method based on an unbalanced hierarchical optimal transport framework.
We make the first attempt to exploit cross-modal alignment in graph matching.
Experiments on various graph matching tasks demonstrate the superiority and robustness of our method compared to state-of-the-art approaches.
arXiv Detail & Related papers (2023-10-18T16:16:53Z) - Entropy Neural Estimation for Graph Contrastive Learning [9.032721248598088]
Contrastive learning on graphs aims at extracting distinguishable high-level representations of nodes.
We propose a simple yet effective subset sampling strategy to contrast pairwise representations between views of a dataset.
We conduct extensive experiments on seven graph benchmarks, and the proposed approach achieves competitive performance.
arXiv Detail & Related papers (2023-07-26T03:55:08Z) - Line Graph Contrastive Learning for Link Prediction [4.876567687745239]
We propose a Line Graph Contrastive Learning (LGCL) method to obtain multiview information.
With experiments on six public datasets, LGCL outperforms current benchmarks on link prediction tasks.
arXiv Detail & Related papers (2022-10-25T06:57:00Z) - CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph Similarity Learning [65.1042892570989]
We propose a contrastive graph matching network (CGMN) for self-supervised graph similarity learning.
We employ two strategies, namely cross-view interaction and cross-graph interaction, for effective node representation learning.
We transform node representations into graph-level representations via pooling operations for graph similarity computation.
arXiv Detail & Related papers (2022-05-30T13:20:26Z) - GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z) - Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which shows that our model can effectively improve the performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z) - Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z) - CoSimGNN: Towards Large-scale Graph Similarity Computation [5.17905821006887]
Graph Neural Networks (GNNs) provide a data-driven solution for this task.
Existing GNN-based methods, which either embed the two graphs separately or deploy cross-graph interactions over whole graph pairs, still fail to achieve competitive results.
We propose the "embedding-coarsening-matching" framework CoSimGNN, which first embeds and coarsens large graphs with adaptive pooling operation and then deploys fine-grained interactions on the coarsened graphs for final similarity scores.
arXiv Detail & Related papers (2020-05-14T16:33:13Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)