Poisson Reweighted Laplacian Uncertainty Sampling for Graph-based Active
Learning
- URL: http://arxiv.org/abs/2210.15786v1
- Date: Thu, 27 Oct 2022 22:07:53 GMT
- Title: Poisson Reweighted Laplacian Uncertainty Sampling for Graph-based Active
Learning
- Authors: Kevin Miller and Jeff Calder
- Abstract summary: We show that uncertainty sampling is sufficient to achieve exploration versus exploitation in graph-based active learning.
In particular, we use a recently developed algorithm, Poisson ReWeighted Laplace Learning (PWLL) for the classifier.
We present experimental results on a number of graph-based image classification problems.
- Score: 1.6752182911522522
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We show that uncertainty sampling is sufficient to achieve exploration versus
exploitation in graph-based active learning, as long as the measure of
uncertainty properly aligns with the underlying model and the model properly
reflects uncertainty in unexplored regions. In particular, we use a recently
developed algorithm, Poisson ReWeighted Laplace Learning (PWLL) for the
classifier and we introduce an acquisition function designed to measure
uncertainty in this graph-based classifier that identifies unexplored regions
of the data. We introduce a diagonal perturbation in PWLL which produces
exponential localization of solutions, and controls the exploration versus
exploitation tradeoff in active learning. We use the well-posed continuum limit
of PWLL to rigorously analyze our method, and present experimental results on a
number of graph-based image classification problems.
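As a rough illustration of the mechanism the abstract describes, the sketch below runs plain graph Laplace learning with a diagonal perturbation `tau` and a smallest-margin acquisition function. This is not the authors' PWLL implementation: the Poisson reweighting step is omitted, and the names (`laplace_learning`, `query`, `tau`, `k`) are illustrative choices, not from the paper.

```python
# Illustrative sketch: graph Laplace learning with a diagonal perturbation
# and margin-based uncertainty sampling. NOT the authors' exact PWLL
# algorithm; the Poisson reweighting step is omitted for brevity.
import numpy as np
from scipy.spatial.distance import cdist

def laplace_learning(X, labeled_idx, labels, n_classes, k=8, tau=0.1):
    """Solve (L_uu + tau*I) u = -L_ul g on a kNN similarity graph.

    tau > 0 is the diagonal perturbation: it exponentially localizes
    solutions around labeled nodes, so scores decay in unexplored regions.
    """
    n = X.shape[0]
    D = cdist(X, X)
    sigma = np.median(D)
    W = np.exp(-(D / sigma) ** 2)          # Gaussian similarity weights
    np.fill_diagonal(W, 0.0)
    # sparsify to k nearest neighbours, then symmetrize
    keep = np.argsort(-W, axis=1)[:, :k]
    M = np.zeros_like(W)
    rows = np.repeat(np.arange(n), k)
    M[rows, keep.ravel()] = W[rows, keep.ravel()]
    W = np.maximum(M, M.T)
    L = np.diag(W.sum(axis=1)) - W          # unnormalized graph Laplacian
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)
    g = np.zeros((len(labeled_idx), n_classes))
    g[np.arange(len(labeled_idx)), labels] = 1.0   # one-hot boundary data
    A = L[np.ix_(unlabeled, unlabeled)] + tau * np.eye(len(unlabeled))
    b = -L[np.ix_(unlabeled, labeled_idx)] @ g
    U = np.zeros((n, n_classes))
    U[labeled_idx] = g
    U[unlabeled] = np.linalg.solve(A, b)
    return U, unlabeled

def query(U, unlabeled):
    """Acquisition: pick the unlabeled node with the smallest score margin.

    With tau > 0, nodes far from every label have near-zero scores for
    all classes, hence tiny margins -- so uncertainty sampling explores.
    """
    s = np.sort(U[unlabeled], axis=1)
    return unlabeled[np.argmin(s[:, -1] - s[:, -2])]

# Toy usage: two well-separated clusters, one label per cluster.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
U, unl = laplace_learning(X, np.array([0, 20]), np.array([0, 1]), n_classes=2)
next_point = query(U, unl)  # node to label next
```

The key design point mirrored from the abstract is the `tau * I` term: it makes the classifier's scores decay away from labeled data, so low-margin queries land in unexplored regions rather than only near decision boundaries.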
Related papers
- Learning Latent Graph Structures and their Uncertainty [63.95971478893842]
Graph Neural Networks (GNNs) use relational information as an inductive bias to enhance the model's accuracy.
As task-relevant relations might be unknown, graph structure learning approaches have been proposed to learn them while solving the downstream prediction task.
arXiv Detail & Related papers (2024-05-30T10:49:22Z)
- Uncertainty for Active Learning on Graphs [70.44714133412592]
Uncertainty Sampling is an Active Learning strategy that aims to improve the data efficiency of machine learning models.
We benchmark Uncertainty Sampling beyond predictive uncertainty and highlight a significant performance gap to other Active Learning strategies.
We develop ground-truth Bayesian uncertainty estimates in terms of the data generating process and prove their effectiveness in guiding Uncertainty Sampling toward optimal queries.
arXiv Detail & Related papers (2024-05-02T16:50:47Z)
- From Registration Uncertainty to Segmentation Uncertainty [11.294691606431526]
We propose a novel framework to concurrently estimate both the epistemic and aleatoric segmentation uncertainties for image registration.
By introducing segmentation uncertainty along with existing methods for estimating registration uncertainty, we offer vital insights into the potential uncertainties at different stages of image registration.
arXiv Detail & Related papers (2024-03-08T07:16:14Z)
- Querying Easily Flip-flopped Samples for Deep Active Learning [63.62397322172216]
Active learning is a machine learning paradigm that aims to improve the performance of a model by strategically selecting and querying unlabeled data.
One effective selection strategy bases queries on the model's predictive uncertainty, which can be interpreted as a measure of how informative a sample is.
This paper proposes the least disagree metric (LDM), defined as the smallest probability of disagreement with the predicted label.
arXiv Detail & Related papers (2024-01-18T08:12:23Z)
- Model-Based Uncertainty in Value Functions [89.31922008981735]
We focus on characterizing the variance over values induced by a distribution over MDPs.
Previous work upper bounds the posterior variance over values by solving a so-called uncertainty Bellman equation.
We propose a new uncertainty Bellman equation whose solution converges to the true posterior variance over values.
arXiv Detail & Related papers (2023-02-24T09:18:27Z)
- Deep Active Learning with Noise Stability [24.54974925491753]
Uncertainty estimation for unlabeled data is crucial to active learning.
We propose a novel algorithm that leverages noise stability to estimate data uncertainty.
Our method is generally applicable in various tasks, including computer vision, natural language processing, and structural data analysis.
arXiv Detail & Related papers (2022-05-26T13:21:01Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space, in contrast to existing techniques which embed each node as a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Unsupervised Embedding Learning from Uncertainty Momentum Modeling [37.674449317054716]
We propose a novel solution to explicitly model and explore the uncertainty of the given unlabeled learning samples.
We leverage such uncertainty momentum modeling during learning, which helps to tackle outliers.
arXiv Detail & Related papers (2021-07-19T14:06:19Z)
- Ask-n-Learn: Active Learning via Reliable Gradient Representations for Image Classification [29.43017692274488]
Deep predictive models rely on human supervision in the form of labeled training data.
We propose Ask-n-Learn, an active learning approach based on gradient embeddings obtained using the pseudo-labels estimated in each iteration of the algorithm.
arXiv Detail & Related papers (2020-09-30T05:19:56Z)
- Graph Embedding with Data Uncertainty [113.39838145450007]
Spectral-based subspace learning is a common data preprocessing step in many machine learning pipelines.
Most subspace learning methods do not take into consideration possible measurement inaccuracies or artifacts that can lead to data with high uncertainty.
arXiv Detail & Related papers (2020-09-01T15:08:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.