Learning the effective order of a hypergraph dynamical system
- URL: http://arxiv.org/abs/2306.01813v1
- Date: Fri, 2 Jun 2023 09:04:45 GMT
- Title: Learning the effective order of a hypergraph dynamical system
- Authors: Leonie Neuhäuser, Michael Scholkemper, Francesco Tudisco, Michael T. Schaub
- Abstract summary: We propose a method to determine the minimum order of a hypergraph necessary to approximate the corresponding dynamics accurately.
Specifically, we develop an analytical framework that allows us to determine this order when the type of dynamics is known.
We utilize these ideas in conjunction with a hypergraph neural network to directly learn the dynamics itself and the resulting order of the hypergraph from both synthetic and real data sets.
- Score: 4.530678016396476
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamical systems on hypergraphs can display a rich set of behaviours not
observable for systems with pairwise interactions. Given a distributed
dynamical system with a putative hypergraph structure, an interesting question
is thus how much of this hypergraph structure is actually necessary to
faithfully replicate the observed dynamical behaviour. To answer this question,
we propose a method to determine the minimum order of a hypergraph necessary to
approximate the corresponding dynamics accurately. Specifically, we develop an
analytical framework that allows us to determine this order when the type of
dynamics is known. We utilize these ideas in conjunction with a hypergraph
neural network to directly learn the dynamics itself and the resulting order of
the hypergraph from both synthetic and real data sets consisting of observed
system trajectories.
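To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): simulate trajectories of a toy system driven by a single third-order hyperedge, fit surrogate models whose interactions are restricted to hyperedges up to a maximum order, and keep the smallest order whose least-squares residual on the estimated derivatives becomes negligible. Intuitively, a third-order interaction that merely sums the states of the other hyperedge members can be replaced by pairwise edges, whereas a genuinely multiplicative one cannot; the sketch probes this by brute force, whereas the paper provides an analytical criterion (when the dynamics are known) and a hypergraph neural network (when they must be learned). All names and the toy dynamics below are illustrative assumptions.

```python
# Hypothetical sketch of order selection (not the authors' implementation):
# fit interaction models of increasing maximum hyperedge order to observed
# trajectories and keep the smallest order whose residual is negligible.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, dt, steps = 5, 0.01, 100

# Toy ground truth: dynamics driven by a single third-order hyperedge {0, 1, 2}.
true_edges = {(0, 1, 2): 0.5}

def derivative(x, edges):
    """dx_i/dt = sum over hyperedges e containing i of w_e * prod_{j in e, j != i} x_j."""
    dx = np.zeros_like(x)
    for e, w in edges.items():
        for i in e:
            dx[i] += w * np.prod(x[[j for j in e if j != i]])
    return dx

# One forward-Euler trajectory plus finite-difference derivative estimates.
x = rng.uniform(0.2, 0.8, n)
traj = [x.copy()]
for _ in range(steps):
    x = x + dt * derivative(x, true_edges)
    traj.append(x.copy())
traj = np.array(traj)
states, dxdt = traj[:-1], np.diff(traj, axis=0) / dt

def fit_residual(max_order):
    """Least-squares fit of hyperedge weights up to max_order; returns relative residual."""
    cands = [e for k in range(2, max_order + 1)
             for e in itertools.combinations(range(n), k)]
    rows, targets = [], []
    for x_t, dx_t in zip(states, dxdt):
        for i in range(n):
            rows.append([np.prod(x_t[[j for j in e if j != i]]) if i in e else 0.0
                         for e in cands])
            targets.append(dx_t[i])
    A, b = np.array(rows), np.array(targets)
    w = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.linalg.norm(A @ w - b) / np.linalg.norm(b)

for order in range(2, n + 1):
    r = fit_residual(order)
    print(f"max hyperedge order {order}: relative residual {r:.2e}")
    if r < 1e-6:
        print(f"estimated effective order: {order}")
        break
```

With the toy system above, the pairwise fit leaves a visible residual while the third-order fit is essentially exact, so the search stops at order 3.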
Related papers
- Graph Attention Inference of Network Topology in Multi-Agent Systems [0.0]
Our work introduces a novel machine learning-based solution that leverages the attention mechanism to predict future states of multi-agent systems.
The graph structure is then inferred from the strength of the attention values.
Our results demonstrate that the presented data-driven graph attention machine learning model can identify the network topology in multi-agent systems.
arXiv Detail & Related papers (2024-08-27T23:58:51Z)
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- Learning from Heterogeneity: A Dynamic Learning Framework for Hypergraphs [22.64740740462169]
We propose a hypergraph learning framework named LFH that is capable of dynamic hyperedge construction and attentive embedding update.
To evaluate the effectiveness of our proposed framework, we conduct comprehensive experiments on several popular datasets.
arXiv Detail & Related papers (2023-07-07T06:26:44Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Deep Hypergraph Structure Learning [34.972686247703024]
Learning on high-order correlations has shown superiority in data representation learning, where hypergraphs have been widely used in recent decades.
How to generate the hypergraph structure among data is still a challenging task.
DeepHGSL is designed to optimize the hypergraph structure for hypergraph-based representation learning.
arXiv Detail & Related papers (2022-08-26T10:00:11Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - Measuring dynamical systems on directed hyper-graphs [0.0]
We analyze the interplay between the structure of a directed hypergraph and a linear dynamical system, a random walk, defined on it.
We apply known measures to pairwise structures, such as the transition matrix, and determine a family of measures that are amenable to such a procedure (a toy transition-matrix construction is sketched after this list).
arXiv Detail & Related papers (2022-02-25T16:39:40Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
For the first time, we learn dynamic graph representations in hyperbolic space in order to infer node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
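As a companion to the "Measuring dynamical systems on directed hyper-graphs" entry above, here is a toy, hypothetical construction (not the cited paper's definitions): a directed hypergraph is projected onto a pairwise random-walk transition matrix so that standard measures for pairwise structures, such as the stationary distribution, apply directly. The hyperedge list and the uniform tail/head sampling rule are illustrative assumptions.

```python
# Toy, hypothetical construction: project a directed hypergraph onto a pairwise
# random-walk transition matrix, then apply standard pairwise measures.
import numpy as np

# Each directed hyperedge is a (tail set, head set) pair over nodes 0..n-1.
hyperedges = [({0, 1}, {2}), ({2}, {0, 3}), ({3}, {1})]
n = 4

out_deg = [sum(1 for tail, _ in hyperedges if i in tail) for i in range(n)]
T = np.zeros((n, n))
for tail, head in hyperedges:
    for i in tail:
        for j in head:
            # From i: pick an outgoing hyperedge uniformly, then a head node uniformly.
            T[i, j] += 1.0 / (out_deg[i] * len(head))

# Nodes without outgoing hyperedges become absorbing so that T stays stochastic.
for i in range(n):
    if out_deg[i] == 0:
        T[i, i] = 1.0

# A standard pairwise measure: the stationary distribution, via power iteration.
pi = np.ones(n) / n
for _ in range(1000):
    pi = pi @ T
print("row sums:", T.sum(axis=1))          # all 1.0 -> valid transition matrix
print("stationary distribution:", pi)
```

Once the dynamics are encoded in a stochastic matrix, quantities defined for pairwise graphs (stationary distributions, spectral gaps, mixing times) can be computed without further hypergraph-specific machinery.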
This list is automatically generated from the titles and abstracts of the papers on this site.