Investigating the Interplay between Features and Structures in Graph
Learning
- URL: http://arxiv.org/abs/2308.09570v1
- Date: Fri, 18 Aug 2023 14:02:56 GMT
- Title: Investigating the Interplay between Features and Structures in Graph
Learning
- Authors: Daniele Castellana, Federico Errica
- Abstract summary: In the past, it was believed that homophily strongly correlates with better node classification predictions of message-passing methods.
More recently, researchers pointed out that such a dichotomy is too simplistic, as we can construct node classification tasks where graphs are completely heterophilic but performance remains high.
Our work investigates what happens when this strong assumption does not hold, by formalising two generative processes for node classification tasks.
- Score: 6.436174170552484
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the past, the dichotomy between homophily and heterophily has inspired
research contributions toward a better understanding of Deep Graph Networks'
inductive bias. In particular, it was believed that homophily strongly
correlates with better node classification predictions of message-passing
methods. More recently, however, researchers pointed out that such a dichotomy is
too simplistic, as one can construct node classification tasks where graphs are
completely heterophilic but performance remains high. Most of these works
have also proposed new quantitative metrics to understand when a graph
structure is useful, which implicitly or explicitly assume the correlation
between node features and target labels. Our work empirically investigates what
happens when this strong assumption does not hold, by formalising two
generative processes for node classification tasks that allow us to build and
study ad-hoc problems. To quantitatively measure the influence of the node
features on the target labels, we also use a metric we call Feature
Informativeness. We construct six synthetic tasks and evaluate the performance
of six models, including structure-agnostic ones. Our findings reveal that
previously defined metrics are not adequate when we relax the above assumption.
Our contribution to the workshop aims to present novel research findings
that could help advance our understanding of the field.
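As a rough illustration of the quantities at play, the sketch below contrasts the classic edge-homophily ratio with a feature-based informativeness score. The KMeans discretisation, the mutual-information formulation, and the function names are illustrative assumptions, not necessarily the paper's exact definition of Feature Informativeness.

    # Illustrative sketch only; the paper's exact Feature Informativeness definition may differ.
    # Edge homophily is the fraction of edges joining same-class nodes; the feature score below
    # approximates how predictive node features are of labels via normalised mutual information.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import mutual_info_score

    def edge_homophily(edge_index, labels):
        # edge_index: (2, E) integer array of edge endpoints; labels: (N,) array of class ids
        src, dst = edge_index
        return float(np.mean(labels[src] == labels[dst]))

    def feature_informativeness(features, labels, n_bins=16, seed=0):
        # Discretise the (N, F) feature matrix by clustering, then measure how much knowing
        # a node's cluster reduces uncertainty about its label (mutual information, normalised
        # by the label entropy so the score lies roughly in [0, 1]).
        clusters = KMeans(n_clusters=n_bins, n_init=10, random_state=seed).fit_predict(features)
        class_freq = np.bincount(labels) / len(labels)
        label_entropy = -np.sum(class_freq * np.log(class_freq + 1e-12))
        return mutual_info_score(clusters, labels) / (label_entropy + 1e-12)

On a task built to be completely heterophilic yet feature-driven, edge_homophily would be close to zero while feature_informativeness stays high, which is precisely the regime where the homophily/heterophily dichotomy alone says little about expected performance.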
Related papers
- Multistage non-deterministic classification using secondary concept graphs and graph convolutional networks for high-level feature extraction [0.0]
In domains with diverse topics, graph representations illustrate interrelations among features.
Despite these achievements, predicting and assigning 9 deterministic classes often involves errors.
We present a multi-stage non-deterministic classification method based on a secondary conceptual graph and graph convolutional networks.
arXiv Detail & Related papers (2024-11-09T15:28:45Z) - The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle states that nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets where GNNs' performance is not satisfactory compared to that of standard NNs.
arXiv Detail & Related papers (2024-07-12T18:04:32Z) - Robust Graph Structure Learning under Heterophily [12.557639223778722]
We propose a novel robust graph structure learning method to achieve a high-quality graph from heterophilic data for downstream tasks.
We first apply a high-pass filter to make each node more distinctive from its neighbors by encoding structure information into the node features (a minimal sketch of such a filter appears after this list).
Then, we learn a robust graph with an adaptive norm characterizing different levels of noise.
arXiv Detail & Related papers (2024-03-06T12:29:13Z) - On Discrepancies between Perturbation Evaluations of Graph Neural
Network Attributions [49.8110352174327]
We assess attribution methods from a perspective not previously explored in the graph domain: retraining.
The core idea is to retrain the network on important (or not important) relationships as identified by the attributions.
We run our analysis on four state-of-the-art GNN attribution methods and five synthetic and real-world graph classification datasets.
arXiv Detail & Related papers (2024-01-01T02:03:35Z) - Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks [3.566568169425391]
We show that with increased depth, node representations become dominated by a low-dimensional subspace that depends on the aggregation function but not on the feature transformations.
For all aggregation functions, the rank of the node representations collapses, resulting in over-smoothing for particular aggregation functions (a toy demonstration of this collapse appears after this list).
arXiv Detail & Related papers (2023-08-31T15:22:31Z) - Contrastive Learning for Non-Local Graphs with Multi-Resolution
Structural Views [1.4445779250002606]
We propose a novel multiview contrastive learning approach that integrates diffusion filters on graphs.
By incorporating multiple graph views as augmentations, our method captures the structural equivalence in heterophilic graphs.
arXiv Detail & Related papers (2023-08-19T17:42:02Z) - Graph-in-Graph (GiG): Learning interpretable latent graphs in
non-Euclidean domain for biological and healthcare applications [52.65389473899139]
Graphs are a powerful tool for representing and analyzing unstructured, non-Euclidean data ubiquitous in the healthcare domain.
Recent works have shown that considering relationships between input data samples has a positive regularizing effect on the downstream task.
We propose Graph-in-Graph (GiG), a neural network architecture for protein classification and brain imaging applications.
arXiv Detail & Related papers (2022-04-01T10:01:37Z) - Deconfounded Training for Graph Neural Networks [98.06386851685645]
We present a new paradigm of deconfounded training (DTP) that better mitigates the confounding effect and latches on the critical information.
Specifically, we adopt the attention modules to disentangle the critical subgraph and trivial subgraph.
It allows GNNs to capture a more reliable subgraph whose relation with the label is robust across different distributions.
arXiv Detail & Related papers (2021-12-30T15:22:35Z) - Exploiting Heterogeneous Graph Neural Networks with Latent Worker/Task
Correlation Information for Label Aggregation in Crowdsourcing [72.34616482076572]
Crowdsourcing has attracted much attention for its convenience in collecting labels from non-expert workers instead of experts.
We propose a novel framework based on graph neural networks for aggregating crowd labels.
arXiv Detail & Related papers (2020-10-25T10:12:37Z) - A Survey of Adversarial Learning on Graphs [59.21341359399431]
We investigate and summarize the existing works on graph adversarial learning tasks.
Specifically, we survey and unify the existing works w.r.t. attack and defense in graph analysis tasks.
We emphasize the importance of related evaluation metrics, investigating and summarizing them comprehensively.
arXiv Detail & Related papers (2020-03-10T12:48:00Z)
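The high-pass filtering step mentioned in the Robust Graph Structure Learning entry above can be read as subtracting each node's neighbourhood average from its own features, i.e. applying (I - D^{-1}A) to the feature matrix. The sketch below assumes that common formulation; the paper's actual filter and the subsequent adaptive-norm graph learning step may differ.

    # Minimal sketch of a high-pass graph filter: X_hp = (I - D^{-1} A) X.
    # It keeps the part of each node's features that deviates from its neighbourhood mean,
    # which makes nodes more distinguishable from their (possibly heterophilic) neighbours.
    import numpy as np

    def high_pass_filter(adj, features, eps=1e-12):
        # adj: (N, N) adjacency matrix; features: (N, F) node feature matrix
        deg = adj.sum(axis=1, keepdims=True)
        neighbour_mean = (adj @ features) / (deg + eps)   # D^{-1} A X
        return features - neighbour_mean                  # (I - D^{-1} A) X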
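The rank-collapse behaviour described in the over-smoothing entry above can also be observed directly: repeatedly applying mean aggregation (the row-normalised adjacency D^{-1}A) to random features makes a single direction dominate the representations. This is a toy numpy demonstration on a random graph, not the authors' analysis or any benchmark dataset.

    # Toy demonstration: the representations' spectrum collapses as depth grows.
    import numpy as np

    rng = np.random.default_rng(0)
    n, f = 200, 32
    adj = (rng.random((n, n)) < 0.05).astype(float)
    adj = np.maximum(adj, adj.T)                      # make the random graph undirected
    np.fill_diagonal(adj, 1.0)                        # self-loops keep every degree positive
    prop = adj / adj.sum(axis=1, keepdims=True)       # mean aggregation, D^{-1} A

    x = rng.standard_normal((n, f))
    for depth in (1, 2, 4, 8, 16, 32):
        h = np.linalg.matrix_power(prop, depth) @ x
        s = np.linalg.svd(h, compute_uv=False)        # singular values, largest first
        num_rank = int(np.sum(s > 1e-3 * s[0]))
        print(f"depth {depth:2d}: numerical rank ~ {num_rank}, s2/s1 = {s[1] / s[0]:.2e}")

As depth increases, s2/s1 shrinks towards zero, i.e. the node representations become dominated by a low-dimensional (here, essentially one-dimensional) subspace.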
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.