What Makes Graph Neural Networks Miscalibrated?
- URL: http://arxiv.org/abs/2210.06391v1
- Date: Wed, 12 Oct 2022 16:41:42 GMT
- Title: What Makes Graph Neural Networks Miscalibrated?
- Authors: Hans Hao-Hsun Hsu and Yuesong Shen and Christian Tomani and Daniel Cremers
- Abstract summary: We conduct a systematic study on the calibration qualities of graph neural networks (GNNs).
We identify five factors which influence the calibration of GNNs: general under-confident tendency, diversity of nodewise predictive distributions, distance to training nodes, relative confidence level, and neighborhood similarity.
We design a novel calibration method named Graph Attention Temperature Scaling (GATS), which is tailored for calibrating graph neural networks.
- Score: 48.00374886504513
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Given the importance of getting calibrated predictions and reliable
uncertainty estimations, various post-hoc calibration methods have been
developed for neural networks on standard multi-class classification tasks.
However, these methods are not well suited for calibrating graph neural
networks (GNNs), which presents unique challenges such as accounting for the
graph structure and the graph-induced correlations between the nodes. In this
work, we conduct a systematic study on the calibration qualities of GNN node
predictions. In particular, we identify five factors which influence the
calibration of GNNs: general under-confident tendency, diversity of nodewise
predictive distributions, distance to training nodes, relative confidence
level, and neighborhood similarity. Furthermore, based on the insights from
this study, we design a novel calibration method named Graph Attention
Temperature Scaling (GATS), which is tailored for calibrating graph neural
networks. GATS incorporates designs that address all the identified influential
factors and produces nodewise temperature scaling using an attention-based
architecture. GATS is accuracy-preserving, data-efficient, and expressive at
the same time. Our experiments empirically verify the effectiveness of GATS,
demonstrating that it can consistently achieve state-of-the-art calibration
results on various graph datasets for different GNN backbones.
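For context, the baseline that GATS builds on is post-hoc temperature scaling: a single scalar T > 0 is fitted on held-out validation logits by minimizing the negative log-likelihood, and since dividing logits by a positive scalar never changes the argmax, accuracy is preserved. A minimal sketch of that baseline (illustrative only, not the authors' code; function and variable names are ours):

```python
import torch
import torch.nn.functional as F

def fit_temperature(logits, labels, steps=200, lr=0.05):
    """Fit a single scalar temperature by minimizing NLL on validation data.

    logits: [N, C] frozen model outputs on validation nodes
    labels: [N]    ground-truth class indices
    """
    log_t = torch.zeros(1, requires_grad=True)  # log-parameterization keeps T > 0
    optimizer = torch.optim.Adam([log_t], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        loss = F.cross_entropy(logits / log_t.exp(), labels)
        loss.backward()
        optimizer.step()
    return log_t.exp().detach()

# Calibrated probabilities: torch.softmax(logits / T, dim=1)
```

GATS replaces the single scalar with a nodewise temperature T_i produced by an attention-based architecture, so the amount of rescaling can adapt to the five factors identified above while the method stays accuracy-preserving.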
Related papers
- GETS: Ensemble Temperature Scaling for Calibration in Graph Neural Networks [8.505932176266368]
Graph Neural Networks deliver strong classification results but often suffer from poor calibration performance, leading to overconfidence or underconfidence.
Existing post hoc methods, such as temperature scaling, fail to effectively utilize graph structures, while current GNN calibration methods often overlook the potential of leveraging diverse input information and model ensembles jointly.
In this paper, we propose Graph Ensemble Temperature Scaling (GETS), a novel calibration framework that combines input and model ensemble strategies within a Graph Mixture of Experts architecture. GETS outperforms SOTA calibration techniques, reducing expected calibration error by 25 percent across 10 GNN benchmark datasets.
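Expected calibration error (ECE), the metric behind the 25 percent figure above, partitions predictions into confidence bins and averages the per-bin gap between accuracy and confidence. A minimal sketch using the common 15-equal-width-bin convention (the exact binning used in the paper may differ):

```python
import torch

def expected_calibration_error(probs, labels, n_bins=15):
    """ECE: confidence-binned average of |accuracy - confidence|,
    weighted by the fraction of samples falling in each bin.

    probs:  [N, C] predicted class probabilities
    labels: [N]    ground-truth class indices
    """
    conf, pred = probs.max(dim=1)
    correct = pred.eq(labels).float()
    edges = torch.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            gap = (correct[in_bin].mean() - conf[in_bin].mean()).abs()
            ece += in_bin.float().mean() * gap
    return float(ece)
```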
arXiv Detail & Related papers (2024-10-12T15:34:41Z)
- Accurate and Scalable Estimation of Epistemic Uncertainty for Graph Neural Networks [40.95782849532316]
We propose a novel training framework designed to improve intrinsic GNN uncertainty estimates.
Our framework adapts the principle of centering data to graph data through novel graph anchoring strategies.
Our work provides insights into uncertainty estimation for GNNs, and demonstrates the utility of G-$\Delta$UQ in obtaining reliable estimates.
arXiv Detail & Related papers (2024-01-07T00:58:33Z)
- Learning to Reweight for Graph Neural Network [63.978102332612906]
Graph Neural Networks (GNNs) show promising results for graph tasks.
The generalization ability of existing GNNs degrades when there are distribution shifts between training and testing graph data.
We propose a novel nonlinear graph decorrelation method, which can substantially improve the out-of-distribution generalization ability.
arXiv Detail & Related papers (2023-12-19T12:25:10Z)
- SimCalib: Graph Neural Network Calibration based on Similarity between Nodes [60.92081159963772]
Graph neural networks (GNNs) have exhibited impressive performance in modeling graph data as exemplified in various applications.
We shed light on the relationship between GNN calibration and nodewise similarity via theoretical analysis.
A novel calibration framework, named SimCalib, is accordingly proposed to consider similarity between nodes at global and local levels.
arXiv Detail & Related papers (2023-12-19T04:58:37Z)
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, that aims to assess the performance of a specific GNN model trained on labeled and observed graphs.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- On Calibration of Graph Neural Networks for Node Classification [29.738179864433445]
Graph neural networks learn entity and edge embeddings for tasks such as node classification and link prediction.
These models achieve good performance with respect to accuracy, but the confidence scores associated with the predictions might not be calibrated.
We propose a topology-aware calibration method that takes the neighboring nodes into account and yields improved calibration.
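To make "takes the neighboring nodes into account" concrete, here is a toy sketch in the spirit of such topology-aware calibrators (purely illustrative; this is not the architecture proposed in the paper): each node gets a positive temperature predicted from its own logits together with its mean neighbor logits.

```python
import torch
import torch.nn as nn

class NeighborAwareTemperature(nn.Module):
    """Toy topology-aware calibrator: predicts one positive temperature
    per node from the node's logits and its mean neighbor logits."""

    def __init__(self, num_classes, hidden=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * num_classes, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Softplus(),  # keeps temperatures strictly positive
        )

    def forward(self, logits, adj_norm):
        # adj_norm: [N, N] row-normalized adjacency (dense for simplicity)
        neighbor_logits = adj_norm @ logits      # mean of neighbors' logits
        t = self.mlp(torch.cat([logits, neighbor_logits], dim=1)) + 1e-3
        return logits / t                        # per-node scaling keeps each argmax
```

Such a module would be trained with NLL on validation nodes while the GNN stays frozen; because the temperatures are predicted from inputs rather than stored as free per-node parameters, the calibrator generalizes to unseen test nodes.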
arXiv Detail & Related papers (2022-06-03T13:48:10Z)
- Multivariate Time Series Forecasting with Transfer Entropy Graph [5.179058210068871]
We propose a novel end-to-end deep learning model, termed graph neural network with Neural Granger Causality (CauGNN).
Each variable is regarded as a graph node, and each edge represents the causal relationship between variables.
Three benchmark datasets from the real world are used to evaluate the proposed CauGNN.
arXiv Detail & Related papers (2020-05-03T20:51:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.