MaxCutPool: differentiable feature-aware Maxcut for pooling in graph neural networks
- URL: http://arxiv.org/abs/2409.05100v2
- Date: Tue, 10 Sep 2024 08:00:19 GMT
- Title: MaxCutPool: differentiable feature-aware Maxcut for pooling in graph neural networks
- Authors: Carlo Abate, Filippo Maria Bianchi
- Abstract summary: We propose a novel approach to compute the MAXCUT in attributed graphs, i.e., graphs with features associated with nodes and edges.
Our approach is robust to the underlying graph topology and is fully differentiable, making it possible to find solutions that jointly optimize the MAXCUT along with other objectives.
- Score: 5.524804393257921
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel approach to compute the MAXCUT in attributed graphs, i.e., graphs with features associated with nodes and edges. Our approach is robust to the underlying graph topology and is fully differentiable, making it possible to find solutions that jointly optimize the MAXCUT along with other objectives. Based on the obtained MAXCUT partition, we implement a hierarchical graph pooling layer for Graph Neural Networks, which is sparse, differentiable, and particularly suitable for downstream tasks on heterophilic graphs.
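The abstract does not spell out how the MAXCUT objective is made differentiable. A common relaxation, sketched below purely as an assumption and not as the paper's actual layer, replaces the binary partition with continuous node scores derived from node features, so a soft cut loss can be minimized by gradient descent jointly with other objectives.

import torch

def soft_maxcut_loss(scores, edge_index):
    # scores: (N,) soft partition assignments in [-1, 1].
    # edge_index: (2, E) edge list of the graph.
    # (1 - s_i * s_j) / 2 smoothly approximates "edge (i, j) is cut",
    # so minimizing the negated sum encourages a large cut.
    src, dst = edge_index
    return -0.5 * (1.0 - scores[src] * scores[dst]).sum()

# Toy usage (hypothetical setup): in practice the scores would come from a
# small GNN/MLP over node features, making the partition feature-aware and
# trainable end to end alongside any downstream loss.
num_nodes, num_edges = 6, 10
edge_index = torch.randint(0, num_nodes, (2, num_edges))
raw_scores = torch.randn(num_nodes, requires_grad=True)

loss = soft_maxcut_loss(torch.tanh(raw_scores), edge_index)
loss.backward()  # gradients reach raw_scores, so the partition is learnable

The hard partition (the sign of the scores) could then drive the graph coarsening for hierarchical pooling, as the abstract describes; the specific pooling construction is detailed in the paper itself.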
Related papers
- Improving Graph Neural Networks by Learning Continuous Edge Directions [0.0]
Graph Neural Networks (GNNs) traditionally employ a message-passing mechanism that resembles diffusion over undirected graphs.
Our key insight is to assign fuzzy edge directions to the edges of a graph so that features can preferentially flow in one direction between nodes.
We propose a general framework, called Continuous Edge Direction (CoED) GNN, for learning on graphs with fuzzy edges.
arXiv Detail & Related papers (2024-10-18T01:34:35Z)
- Graph Similarity Regularized Softmax for Semi-Supervised Node Classification [33.297649538686045]
Graph Neural Networks (GNNs) are powerful deep learning models designed for graph-structured data.
We propose a graph similarity regularized softmax for GNNs in semi-supervised node classification.
arXiv Detail & Related papers (2024-09-20T14:38:16Z)
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Multi-Granularity Graph Pooling for Video-based Person Re-Identification [14.943835935921296]
Graph neural networks (GNNs) are introduced to aggregate temporal and spatial features of video samples.
Existing graph-based models, like STGCN, perform mean/max pooling on node features to obtain the graph representation.
We propose the graph pooling network (GPNet) to learn the multi-granularity graph representation for video retrieval.
arXiv Detail & Related papers (2022-09-23T13:26:05Z)
- Deep Manifold Learning with Graph Mining [80.84145791017968]
We propose a novel graph deep model with a non-gradient decision layer for graph mining.
The proposed model achieves state-of-the-art performance compared to current models.
arXiv Detail & Related papers (2022-07-18T04:34:08Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric models for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Graph Cross Networks with Vertex Infomax Pooling [69.38969610952927]
We propose a novel graph cross network (GXN) to achieve comprehensive feature learning from multiple scales of a graph.
Based on trainable hierarchical representations of a graph, GXN enables the interchange of intermediate features across scales to promote information flow.
arXiv Detail & Related papers (2020-10-05T06:34:23Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Wasserstein-based Graph Alignment [56.84964475441094]
We cast a new formulation for the one-to-many graph alignment problem, which aims at matching a node in the smaller graph with one or more nodes in the larger graph.
We show that our method leads to significant improvements with respect to the state-of-the-art algorithms for each of these tasks.
arXiv Detail & Related papers (2020-03-12T22:31:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.