Bayesian Layer Graph Convolutional Network for Hyperspectral Image
Classification
- URL: http://arxiv.org/abs/2211.07316v1
- Date: Mon, 14 Nov 2022 12:56:56 GMT
- Title: Bayesian Layer Graph Convolutional Network for Hyperspectral Image
Classification
- Authors: Mingyang Zhang, Ziqi Di, Maoguo Gong, Yue Wu, Hao Li, Xiangming Jiang
- Abstract summary: Graph convolutional network (GCN) based models have shown impressive performance.
Deep learning frameworks based on point estimation suffer from low generalization and an inability to quantify the uncertainty of their classification results.
In this paper, we propose a Bayesian layer, built on Bayesian ideas, that can be inserted into point-estimation-based neural networks.
A Generative Adversarial Network (GAN) is built to solve the sample imbalance problem of HSI dataset.
- Score: 24.91896527342631
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In recent years, research on hyperspectral image (HSI)
classification has made continuous progress in introducing deep network
models, and recently graph
convolutional network (GCN) based models have shown impressive performance.
However, these deep learning frameworks based on point estimation suffer from
low generalization and an inability to quantify the uncertainty of the
classification results. On the other hand, simply applying a Bayesian Neural
Network (BNN) based on distribution estimation to classify HSI cannot achieve
high classification accuracy due to its large number of parameters. In this
paper, we design a Bayesian layer, built on Bayesian ideas, that can be
inserted into point-estimation-based neural networks, and propose a Bayesian
Layer Graph Convolutional Network (BLGCN) model by combining it with graph
convolution operations, which can effectively extract graph information and
estimate the uncertainty of the
classification results. Moreover, a Generative Adversarial Network (GAN) is
built to solve the sample imbalance problem of HSI datasets. Finally, we
design a dynamic control training strategy based on the confidence interval
of the classification results, which terminates training early once the
confidence interval reaches a preset threshold. The experimental results show
that our model achieves a balance between high classification accuracy and
strong generalization. In addition, it can quantify the uncertainty of the
classification results.
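The core idea of the Bayesian layer can be illustrated with a minimal sketch. This is not the authors' BLGCN code; it is a generic NumPy illustration, assuming a Gaussian weight posterior parameterized by a mean `mu` and a pre-softplus spread `rho`, attached as a classification head on top of features from any deterministic (point-estimate) backbone such as a graph convolution. Repeated stochastic forward passes yield both a mean prediction and an uncertainty estimate:

```python
# Hedged sketch of a "Bayesian layer" inserted after a point-estimate backbone.
# All names (BayesianLinear, mu, rho) are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

class BayesianLinear:
    """Linear layer whose weights are Gaussian distributions, not point values."""
    def __init__(self, n_in, n_out):
        self.mu = rng.normal(0.0, 0.1, (n_in, n_out))  # weight means
        self.rho = np.full((n_in, n_out), -3.0)        # pre-softplus spreads

    def forward(self, x):
        sigma = np.log1p(np.exp(self.rho))             # softplus -> positive std
        # Reparameterization trick: sample weights as mu + sigma * noise
        w = self.mu + sigma * rng.standard_normal(self.mu.shape)
        return x @ w

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Stand-in for features from a deterministic backbone (e.g. graph-convolved
# node features); here just random numbers for the sketch.
features = rng.standard_normal((5, 16))  # 5 samples, 16-dim features
bayes_head = BayesianLinear(16, 3)       # 3 classes

# Monte Carlo forward passes: the mean gives the class probabilities, the
# spread across samples gives a per-class uncertainty estimate.
probs = np.stack([softmax(bayes_head.forward(features)) for _ in range(100)])
mean_prob = probs.mean(axis=0)  # prediction, shape (5, 3)
std_prob = probs.std(axis=0)    # uncertainty, shape (5, 3)
```

The spread `std_prob` is the kind of quantity a confidence-interval-based stopping rule, as described in the abstract, could monitor during training.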
Related papers
- Positional Encoder Graph Quantile Neural Networks for Geographic Data [4.277516034244117]
We introduce the Positional Graph Quantile Neural Network (PE-GQNN), a novel method that integrates PE-GNNs, Quantile Neural Networks, and recalibration techniques in a fully nonparametric framework.
Experiments on benchmark datasets demonstrate that PE-GQNN significantly outperforms existing state-of-the-art methods in both predictive accuracy and uncertainty quantification.
arXiv Detail & Related papers (2024-09-27T16:02:12Z)
- Graph Mining under Data scarcity [6.229055041065048]
We propose an Uncertainty Estimator framework that can be applied on top of any generic Graph Neural Network (GNN).
We train these models under the classic episodic learning paradigm in the $n$-way, $k$-shot fashion, in an end-to-end setting.
Our method outperforms the baselines, which demonstrates the efficacy of the Uncertainty Estimator for Few-shot node classification on graphs with a GNN.
arXiv Detail & Related papers (2024-06-07T10:50:03Z)
- DFA-GNN: Forward Learning of Graph Neural Networks by Direct Feedback Alignment [57.62885438406724]
Graph neural networks are recognized for their strong performance across various applications.
Backpropagation (BP) has limitations that challenge its biological plausibility and affect the efficiency, scalability, and parallelism of training neural networks for graph-based tasks.
We propose DFA-GNN, a novel forward learning framework tailored for GNNs with a case study of semi-supervised learning.
arXiv Detail & Related papers (2024-06-04T07:24:51Z)
- Uncertainty Quantification over Graph with Conformalized Graph Neural Networks [52.20904874696597]
Graph Neural Networks (GNNs) are powerful machine learning prediction models on graph-structured data.
GNNs lack rigorous uncertainty estimates, limiting their reliable deployment in settings where the cost of errors is significant.
We propose conformalized GNN (CF-GNN), extending conformal prediction (CP) to graph-based models for guaranteed uncertainty estimates.
arXiv Detail & Related papers (2023-05-23T21:38:23Z)
- Energy-based Out-of-Distribution Detection for Graph Neural Networks [76.0242218180483]
We propose a simple, powerful and efficient OOD detection model for GNN-based learning on graphs, which we call GNNSafe.
GNNSafe achieves up to 17.0% AUROC improvement over state-of-the-art methods and can serve as a simple yet strong baseline in this under-developed area.
arXiv Detail & Related papers (2023-02-06T16:38:43Z)
- Distribution Free Prediction Sets for Node Classification [0.0]
We leverage recent advances in conformal prediction to construct prediction sets for node classification in inductive learning scenarios.
We show through experiments on standard benchmark datasets using popular GNN models that our approach provides tighter and better prediction sets than a naive application of conformal prediction.
arXiv Detail & Related papers (2022-11-26T12:54:45Z)
- Bayesian Convolutional Neural Networks for Limited Data Hyperspectral Remote Sensing Image Classification [14.464344312441582]
We use a special class of deep neural networks, namely Bayesian neural networks, to classify HSRS images.
Bayesian neural networks provide an inherent tool for measuring uncertainty.
We show that a Bayesian network can outperform a similarly-constructed non-Bayesian convolutional neural network (CNN) and an off-the-shelf Random Forest (RF).
arXiv Detail & Related papers (2022-05-19T00:02:16Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Do We Really Need a Learnable Classifier at the End of Deep Neural Network? [118.18554882199676]
We study the potential of learning a neural network for classification with the classifier randomly initialized as an ETF and fixed during training.
Our experimental results show that our method achieves similar performance on image classification for balanced datasets.
arXiv Detail & Related papers (2022-03-17T04:34:28Z)
- Bayesian Graph Neural Networks with Adaptive Connection Sampling [62.51689735630133]
We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs).
The proposed framework not only alleviates over-smoothing and over-fitting tendencies of deep GNNs, but also enables learning with uncertainty in graph analytic tasks with GNNs.
arXiv Detail & Related papers (2020-06-07T07:06:35Z)
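Several entries above (CF-GNN and the distribution-free prediction sets paper) build on conformal prediction. As a generic illustration of that idea, not either paper's actual method, the following sketch assumes softmax outputs from any classifier (e.g. a GNN) and constructs split conformal prediction sets; the function name and toy data are hypothetical:

```python
# Hedged sketch: generic split conformal prediction sets for classification.
import numpy as np

def conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Prediction sets with approximately (1 - alpha) marginal coverage."""
    n = len(cal_labels)
    # Nonconformity score: 1 - probability assigned to the true class
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected empirical quantile of the calibration scores
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level, method="higher")
    # Keep every class whose nonconformity score is below the threshold
    return [np.where(1.0 - p <= q)[0] for p in test_probs]

# Toy demo: a confident, well-calibrated model yields small prediction sets.
cal_probs = np.tile([0.8, 0.1, 0.1], (20, 1))   # calibration softmax outputs
cal_labels = np.zeros(20, dtype=int)            # true class is always 0 here
sets = conformal_sets(cal_probs, cal_labels, np.array([[0.9, 0.05, 0.05]]))
```

In the demo every calibration score is 0.2, so the threshold is 0.2 and the single test point's set contains only class 0; larger, more informative sets appear when the model is less certain.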
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.