Graphons of Line Graphs
- URL: http://arxiv.org/abs/2409.01656v2
- Date: Tue, 10 Sep 2024 22:17:03 GMT
- Title: Graphons of Line Graphs
- Authors: Sevvandi Kandanaarachchi, Cheng Soon Ong
- Abstract summary: We show a simple method that can shed light on a subset of sparse graphs.
We show that graphs satisfying a particular property, which we call the square-degree property, are sparse but give rise to dense line graphs.
In particular, star graphs satisfy the square-degree property, resulting in dense line graphs and non-zero graphons of line graphs.
- Score: 6.822247359790484
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the problem of estimating graph limits, known as graphons, from observations of sequences of sparse finite graphs. In this paper we show a simple method that can shed light on a subset of sparse graphs. The method involves mapping the original graphs to their line graphs. We show that graphs satisfying a particular property, which we call the square-degree property, are sparse but give rise to dense line graphs. This enables the use of results on graph limits of dense graphs to derive convergence. In particular, star graphs satisfy the square-degree property, resulting in dense line graphs and non-zero graphons of line graphs. We demonstrate empirically that we can distinguish different numbers of stars (which are sparse) by the graphons of their corresponding line graphs, whereas in the original graphs the different numbers of stars all converge to the zero graphon due to sparsity. Similarly, superlinear preferential attachment graphs give rise to dense line graphs almost surely. In contrast, dense graphs, including Erdős-Rényi graphs, make the line graphs sparse, resulting in the zero graphon.
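The star-graph example in the abstract can be checked directly: the line graph of the star K_{1,n} is the complete graph K_n, so a sequence of increasingly sparse stars maps to maximally dense line graphs. A minimal sketch using networkx (the graph sizes below are illustrative choices, not taken from the paper):

```python
import networkx as nx

# The line graph of a star K_{1,n} is the complete graph K_n:
# every pair of star edges shares the hub, so every pair of
# line-graph nodes is adjacent. Stars are sparse (density -> 0)
# while their line graphs are dense (density = 1).
for n in [10, 100, 1000]:
    star = nx.star_graph(n)       # one hub plus n leaves, n edges
    line = nx.line_graph(star)    # nodes of `line` are the edges of `star`
    print(f"n={n}: star density={nx.density(star):.4f}, "
          f"line-graph density={nx.density(line):.4f}")
```

The star's density is 2/(n+1), which vanishes as n grows, while the line-graph density stays at exactly 1; this is the mechanism that lets dense-graph limit theory apply to these sparse originals.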
Related papers
- Parametric Graph Representations in the Era of Foundation Models: A Survey and Position [69.48708136448694]
Graphs have been widely used in the past decades of big data and AI to model comprehensive relational data.
Identifying meaningful graph laws can significantly enhance the effectiveness of various applications.
arXiv Detail & Related papers (2024-10-16T00:01:31Z) - A Graph is Worth $K$ Words: Euclideanizing Graph using Pure Transformer [47.25114679486907]
We introduce GraphsGPT, featuring a Graph2Seq encoder that transforms Non-Euclidean graphs into learnable Graph Words.
A GraphGPT decoder reconstructs the original graph from Graph Words to ensure information equivalence.
arXiv Detail & Related papers (2024-02-04T12:29:40Z) - Finding the Missing-half: Graph Complementary Learning for
Homophily-prone and Heterophily-prone Graphs [48.79929516665371]
Graphs with homophily-prone edges tend to connect nodes with the same class.
Heterophily-prone edges tend to build relationships between nodes with different classes.
Existing GNNs only take the original graph during training.
arXiv Detail & Related papers (2023-06-13T08:06:10Z) - Graph Mixup with Soft Alignments [49.61520432554505]
We study graph data augmentation by mixup, which has been used successfully on images.
We propose S-Mixup, a simple yet effective mixup method for graph classification by soft alignments.
arXiv Detail & Related papers (2023-06-11T22:04:28Z) - Limits, approximation and size transferability for GNNs on sparse graphs via graphops [44.02161831977037]
We adopt the perspective of taking limits of operators derived from graphs, such as the aggregation operation that makes up GNNs.
Our results hold for dense and sparse graphs, and various notions of graph limits.
arXiv Detail & Related papers (2023-06-07T15:04:58Z) - A Graph Convolution for Signed Directed Graphs [0.0]
Graph convolutions for signed directed graphs have received little attention so far.
We propose a novel complex Hermitian adjacency matrix that encodes graph information via complex numbers.
To the best of our knowledge, it is the first spectral convolution for graphs with signs.
arXiv Detail & Related papers (2022-08-23T01:58:35Z) - G-Mixup: Graph Data Augmentation for Graph Classification [55.63157775049443]
Mixup has shown superiority in improving the generalization and robustness of neural networks by interpolating features and labels between two random samples.
We propose $\mathcal{G}$-Mixup to augment graphs for graph classification by interpolating the generator (i.e., graphon) of different classes of graphs.
Experiments show that $\mathcal{G}$-Mixup substantially improves the generalization and robustness of GNNs.
arXiv Detail & Related papers (2022-02-15T04:09:44Z) - Some Algorithms on Exact, Approximate and Error-Tolerant Graph Matching [3.655021726150369]
We present an extensive survey of various exact and inexact graph matching techniques.
A category of graph matching algorithms is presented, which reduces the graph size by removing the less important nodes.
We introduce a novel approach to measure graph similarity using geometric graphs.
arXiv Detail & Related papers (2020-12-30T18:51:06Z) - High-Order Relation Construction and Mining for Graph Matching [36.880853889521845]
Iterated line graphs are introduced for the first time to describe high-order information.
We present a new graph matching method, called High-order Graph Matching Network (HGMN)
By imposing practical constraints, HGMN is made scalable to large-scale graphs.
arXiv Detail & Related papers (2020-10-09T03:30:02Z) - Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.