Nonparametric Modeling of Higher-Order Interactions via Hypergraphons
- URL: http://arxiv.org/abs/2105.08678v1
- Date: Tue, 18 May 2021 17:08:29 GMT
- Title: Nonparametric Modeling of Higher-Order Interactions via Hypergraphons
- Authors: Krishnakumar Balasubramanian
- Abstract summary: We study statistical and algorithmic aspects of using hypergraphons, which are limits of large hypergraphs, for modeling higher-order interactions.
We consider a restricted class of Simple Lipschitz Hypergraphons (SLH) that is amenable to practically efficient estimation.
- Score: 11.6503817521043
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We study statistical and algorithmic aspects of using hypergraphons, which are
limits of large hypergraphs, for modeling higher-order interactions. Although
hypergraphons are extremely powerful from a modeling perspective, we consider a
restricted class of Simple Lipschitz Hypergraphons (SLH) that is amenable to
practically efficient estimation. We also provide rates of convergence for our
estimator that are optimal for the class of SLH. Simulation results are
provided to corroborate the theory.
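To make the model concrete: in a simple hypergraphon, each vertex carries a latent uniform variable, and each hyperedge is included independently with probability given by a symmetric function of its members' latents; estimation then amounts to recovering that function from one observed hypergraph. Below is a minimal Python sketch of sampling a 3-uniform hypergraph from a simple hypergraphon; the particular function `W` is an illustrative Lipschitz choice, not one from the paper.

```python
import itertools
import numpy as np

def sample_simple_hypergraphon(n, W, seed=None):
    """Sample a 3-uniform hypergraph from a simple hypergraphon W.
    Each vertex i gets a latent U_i ~ Uniform[0, 1]; hyperedge {i, j, k}
    is included independently with probability W(U_i, U_j, U_k)."""
    rng = np.random.default_rng(seed)
    U = rng.uniform(size=n)
    edges = [e for e in itertools.combinations(range(n), 3)
             if rng.uniform() < W(*U[list(e)])]
    return U, edges

# Illustrative symmetric, Lipschitz choice of W (not from the paper).
W = lambda x, y, z: 0.25 * (1.0 + x * y * z)

U, edges = sample_simple_hypergraphon(50, W, seed=0)
print(f"{len(edges)} hyperedges on 50 vertices")
```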
Related papers
- Enhancing the Utility of Higher-Order Information in Relational Learning [0.9899763598214121]
We evaluate the effectiveness of hypergraph-level and graph-level architectures in relational learning.
We propose hypergraph-level encodings based on classical hypergraph characteristics.
Our theoretical analysis shows that hypergraph-level encodings provably increase the representational power of message-passing graph neural networks.
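The summary does not spell the encodings out; as a hedged illustration of the general flavor, the sketch below builds node-level features from two classical hypergraph characteristics (node degree and mean incident-hyperedge size). The paper's exact encodings may differ.

```python
import numpy as np

def hypergraph_level_encodings(edges, n):
    """Node encodings from classical hypergraph characteristics:
    degree and mean size of incident hyperedges. (Illustrative only;
    the paper's exact encodings may differ.)"""
    deg = np.zeros(n)
    size_sum = np.zeros(n)
    for e in edges:
        for v in e:
            deg[v] += 1
            size_sum[v] += len(e)
    mean_size = np.divide(size_sum, deg, out=np.zeros(n), where=deg > 0)
    return np.stack([deg, mean_size], axis=1)   # shape (n, 2)

print(hypergraph_level_encodings([(0, 1, 2), (1, 2), (2, 3, 4, 5)], n=6))
```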
arXiv Detail & Related papers (2025-02-13T18:28:17Z)
- Hypergraphs as Weighted Directed Self-Looped Graphs: Spectral Properties, Clustering, Cheeger Inequality [40.215737469808026]
Hypergraphs arise when studying group relations and have been widely used in the field of machine learning.
There is not yet a unified formulation of hypergraphs; the recently proposed edge-dependent vertex weights (EDVW) model is one of the most general ways of modeling hypergraphs.
We propose our definitions of hypergraph Quotient, NCut, boundary/cut, volume, and conductance, which are consistent with the corresponding definitions on graphs.
Then, we prove that the normalized hypergraph Laplacian is associated with the NCut value, which inspires our HyperClus-G algorithm for spectral clustering.
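For context, here is a minimal sketch of the standard EDVW random walk (edge-dependent vertex weights), whose associated Laplacian is what a spectral method like HyperClus-G would operate on; the exact construction in the paper may differ.

```python
import numpy as np

def edvw_random_walk(H, w_e, Q):
    """Transition matrix of the EDVW hypergraph random walk: from vertex v,
    pick an incident hyperedge e w.p. proportional to w_e[e], then land on
    u in e w.p. proportional to its edge-dependent weight Q[u, e].
    H: (n, m) 0/1 incidence; w_e: (m,) edge weights; Q: (n, m) EDVW matrix."""
    d_v = H @ w_e                 # vertex degrees
    delta_e = Q.sum(axis=0)       # hyperedge volumes under Q
    return ((H * w_e) / d_v[:, None]) @ (Q / delta_e).T

H = np.array([[1, 0], [1, 1], [1, 1], [0, 1]], dtype=float)
Q = np.array([[2, 0], [1, 1], [1, 2], [0, 1]], dtype=float)
P = edvw_random_walk(H, np.array([1.0, 2.0]), Q)
L = np.eye(len(P)) - P   # random-walk Laplacian; eigenvectors give clusters
```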
arXiv Detail & Related papers (2024-10-23T05:16:48Z)
- SPHINX: Structural Prediction using Hypergraph Inference Network [19.853413818941608]
We introduce Structural Prediction using Hypergraph Inference Network (SPHINX), a model that learns to infer a latent hypergraph structure in an unsupervised way.
We show that the recent advancement in k-subset sampling represents a suitable tool for producing discrete hypergraph structures.
The resulting model can generate the higher-order structure necessary for any modern hypergraph neural network.
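A crude sketch of Gumbel-based k-subset sampling, the kind of tool the summary refers to; SPHINX's actual estimator is more sophisticated, and the single-softmax relaxation below is only illustrative.

```python
import numpy as np

def gumbel_topk_subset(logits, k, tau=0.5, seed=None):
    """Sample a k-node hyperedge from per-node logits via Gumbel noise.
    Returns relaxed weights (a single softmax -- a crude relaxation; proper
    relaxed top-k estimators are more involved) and the hard top-k subset."""
    rng = np.random.default_rng(seed)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1)
    scores = (logits + g) / tau
    soft = np.exp(scores - scores.max())
    soft /= soft.sum()                # relaxed membership weights
    hard = np.argsort(scores)[-k:]    # discrete hyperedge: top-k nodes
    return soft, hard

soft, members = gumbel_topk_subset(np.array([0.1, 2.0, -1.0, 1.5, 0.3]),
                                   k=3, seed=0)
print("inferred hyperedge:", sorted(members.tolist()))
```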
arXiv Detail & Related papers (2024-10-04T07:49:57Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
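As a toy instance of the idea, the sketch below minimizes a simple quadratic hypergraph-regularized energy by gradient descent and returns the minimizer as node embeddings; the paper's energy family is considerably more expressive.

```python
import numpy as np

def energy_embeddings(X, edges, lam=0.5, steps=300, lr=0.05):
    """Minimize E(Y) = ||Y - X||^2 + lam * sum_e sum_{u<v in e} ||Y_u - Y_v||^2
    by gradient descent; the minimizer serves as node embeddings.
    (A toy quadratic member of the energy family, not the paper's.)"""
    Y = X.copy()
    for _ in range(steps):
        grad = 2.0 * (Y - X)
        for e in edges:
            e = list(e)
            # pairwise penalty within e == len(e) * spread around the mean
            grad[e] += 2.0 * lam * len(e) * (Y[e] - Y[e].mean(axis=0))
        Y -= lr * grad
    return Y

X = np.random.default_rng(0).normal(size=(6, 2))
Y = energy_embeddings(X, [(0, 1, 2), (2, 3, 4, 5)])
```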
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
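A hedged reading of outer-product message passing on a 3-uniform hypergraph, using an adjacency tensor and an einsum; the actual THNN layer may differ in details such as normalization and nonlinearity.

```python
import itertools
import numpy as np

def thnn_style_layer(A, X, W):
    """One tensorized message-passing step on a 3-uniform hypergraph.
    A: (n, n, n) symmetric adjacency tensor; X: (n, d) node features.
    Aggregates outer products of neighbor-pair features, then projects
    them down with W: (d*d, d_out)."""
    n, d = X.shape
    # M[i] = sum_{j,k} A[i, j, k] * outer(X[j], X[k]), flattened to d*d
    M = np.einsum("ijk,jp,kq->ipq", A, X, X).reshape(n, d * d)
    return np.tanh(M @ W)

n, d = 4, 3
A = np.zeros((n, n, n))
for e in [(0, 1, 2), (1, 2, 3)]:
    for p in itertools.permutations(e):
        A[p] = 1.0
rng = np.random.default_rng(0)
out = thnn_style_layer(A, rng.normal(size=(n, d)), rng.normal(size=(d * d, 2)))
print(out.shape)   # (4, 2)
```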
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We adapt the contrastive learning approach from images/graphs (referring to it as HyperGCL) to improve the generalizability of hypergraph neural networks.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
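Two of the moving parts can be sketched directly: a fabricated augmentation that drops hyperedges, and the standard InfoNCE contrastive loss between two views' node embeddings. The loss is the usual formulation; the augmentation parameters below are illustrative.

```python
import numpy as np

def drop_hyperedges(edges, p, rng):
    """Fabricated augmentation: keep each hyperedge with probability 1 - p."""
    return [e for e in edges if rng.uniform() >= p]

def info_nce(Z1, Z2, tau=0.2):
    """InfoNCE loss between two views' node embeddings (rows are positives)."""
    Z1 = Z1 / np.linalg.norm(Z1, axis=1, keepdims=True)
    Z2 = Z2 / np.linalg.norm(Z2, axis=1, keepdims=True)
    logits = Z1 @ Z2.T / tau
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
view = drop_hyperedges([(0, 1, 2), (1, 3), (2, 3, 4)], p=0.2, rng=rng)
print(info_nce(rng.normal(size=(5, 8)), rng.normal(size=(5, 8))))
```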
arXiv Detail & Related papers (2022-10-07T20:12:20Z)
- Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operator.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
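The skeleton of one hypergraph diffusion step is easy to write down: every hyperedge pulls its members toward their mean. ED-HNN replaces such fixed maps with learned equivariant networks, so the sketch below is only the scaffolding.

```python
import numpy as np

def diffusion_step(X, edges, eta=0.5):
    """One linear hypergraph diffusion step: each hyperedge pulls its
    members toward the hyperedge mean. (Illustrative scaffolding; ED-HNN
    learns these node-to-edge and edge-to-node maps with MLPs.)"""
    grad = np.zeros_like(X)
    for e in edges:
        e = list(e)
        grad[e] += X[e] - X[e].mean(axis=0)
    return X - eta * grad

X = np.random.default_rng(0).normal(size=(5, 2))
X = diffusion_step(X, [(0, 1, 2), (2, 3, 4)])
```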
arXiv Detail & Related papers (2022-07-14T06:17:00Z)
- Core-periphery Models for Hypergraphs [0.0]
We introduce a random hypergraph model for core-periphery structure.
We develop a novel statistical inference algorithm that scales to large hypergraphs, with practically linear runtime.
Our inference algorithm is capable of learning embeddings that correspond to the reputation (rank) of a node within the hypergraph.
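A generic core-periphery sampler in the same spirit: each node carries a "reputation" parameter, and a hyperedge appears with probability given by a logistic function of its members' summed reputations. This is an illustration, not the paper's exact likelihood.

```python
import itertools
import numpy as np

def sample_core_periphery(theta, k=3, seed=None):
    """Toy core-periphery hypergraph: a k-node hyperedge e appears w.p.
    sigmoid(sum of member reputations theta_v), so high-theta (core) nodes
    co-occur in many hyperedges. (Generic illustration only.)"""
    rng = np.random.default_rng(seed)
    edges = []
    for e in itertools.combinations(range(len(theta)), k):
        p = 1.0 / (1.0 + np.exp(-theta[list(e)].sum()))
        if rng.uniform() < p:
            edges.append(e)
    return edges

theta = np.array([2.0, 1.5, -2.0, -2.5, -3.0])   # two core, three periphery
print(sample_core_periphery(theta, seed=0))
```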
arXiv Detail & Related papers (2022-06-01T22:11:44Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
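For reference, here is the classic edge-independent (EIVW-style) spectral hypergraph convolution sketched in a few lines, as a baseline that such unified frameworks subsume; GHSC's EDVW-aware operator is not reproduced here.

```python
import numpy as np

def hypergraph_conv(H, w_e, X, Theta):
    """Classic spectral hypergraph convolution (EIVW baseline):
    X' = relu(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta)."""
    Dv = np.diag((H @ w_e) ** -0.5)      # inverse sqrt vertex degrees
    De_inv = np.diag(1.0 / H.sum(axis=0))  # inverse hyperedge degrees
    A = Dv @ (H * w_e) @ De_inv @ H.T @ Dv
    return np.maximum(A @ X @ Theta, 0.0)

H = np.array([[1, 0], [1, 1], [1, 1], [0, 1]], dtype=float)
rng = np.random.default_rng(0)
out = hypergraph_conv(H, np.ones(2), rng.normal(size=(4, 3)),
                      rng.normal(size=(3, 2)))
print(out.shape)   # (4, 2)
```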
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- The Performance of the MLE in the Bradley-Terry-Luce Model in $\ell_{\infty}$-Loss and under General Graph Topologies [76.61051540383494]
We derive novel, general upper bounds on the $\ell_\infty$ estimation error of the Bradley-Terry-Luce model.
We demonstrate that the derived bounds perform well and in some cases are sharper than known results.
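The estimator under discussion is easy to sketch: gradient ascent on the Bradley-Terry-Luce log-likelihood, with scores centered for identifiability. The $\ell_\infty$ loss is then the maximum coordinate error against the true scores. The comparison data below are illustrative.

```python
import numpy as np

def btl_mle(n, comparisons, steps=500, lr=0.5):
    """Gradient ascent on the Bradley-Terry-Luce log-likelihood.
    comparisons: list of (winner, loser); P(i beats j) = sigmoid(s_i - s_j).
    Scores are centered each step for identifiability; the MLE exists
    when the comparison graph is suitably connected."""
    s = np.zeros(n)
    for _ in range(steps):
        grad = np.zeros(n)
        for i, j in comparisons:
            p = 1.0 / (1.0 + np.exp(-(s[i] - s[j])))   # P(i beats j)
            grad[i] += 1.0 - p
            grad[j] -= 1.0 - p
        s += lr * grad / len(comparisons)
        s -= s.mean()
    return s

# Every item both wins and loses, so the comparison graph is connected.
print(btl_mle(3, [(0, 1), (1, 2), (2, 0), (0, 1), (0, 2)]))
```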
arXiv Detail & Related papers (2021-10-20T23:46:35Z)
- Generative hypergraph clustering: from blockmodels to modularity [26.99290024958576]
We propose an expressive generative model of clustered hypergraphs with heterogeneous node degrees and edge sizes.
We show that hypergraph Louvain is highly scalable, demonstrated for example by an experiment on a synthetic hypergraph of one million nodes.
We use our model to analyze different patterns of higher-order structure in school contact networks, U.S. congressional bill cosponsorship, U.S. congressional committees, product categories in co-purchasing behavior, and hotel locations.
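A rough sketch of the all-or-nothing flavor of hypergraph modularity that such generative models give rise to: hyperedges fully contained in one cluster are counted against a with-replacement null. The paper's degree-corrected objective is more refined; this is only an illustration.

```python
import numpy as np

def aon_modularity(edges, labels):
    """All-or-nothing hypergraph modularity sketch: fraction of hyperedges
    fully inside one cluster, minus its expectation under uniformly random
    labels with the same cluster sizes (with-replacement approximation)."""
    labels = np.asarray(labels)
    _, counts = np.unique(labels, return_counts=True)
    shares = counts / len(labels)
    observed = sum(len(set(labels[list(e)])) == 1 for e in edges)
    expected = sum((shares ** len(e)).sum() for e in edges)
    return (observed - expected) / len(edges)

print(aon_modularity([(0, 1, 2), (2, 3), (3, 4, 5)], [0, 0, 0, 1, 1, 1]))
```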
arXiv Detail & Related papers (2021-01-24T00:25:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.