Uncertainty Guarantees on Automated Precision Weeding using Conformal Prediction
- URL: http://arxiv.org/abs/2501.07185v1
- Date: Mon, 13 Jan 2025 10:30:10 GMT
- Title: Uncertainty Guarantees on Automated Precision Weeding using Conformal Prediction
- Authors: Paul Melki, Lionel Bombrun, Boubacar Diallo, Jérôme Dias, Jean-Pierre da Costa
- Abstract summary: The article showcases conformal prediction in action on the task of precision weeding through deep learning-based image classification.
After a detailed presentation of the conformal prediction methodology, the article evaluates this pipeline on two real-world scenarios.
The results show that we are able to provide formal, i.e. certifiable, guarantees on spraying at least 90% of the weeds.
- Score: 0.5172964916120902
- Abstract: Precision agriculture in general, and precision weeding in particular, have greatly benefited from the major advancements in deep learning and computer vision. A large variety of commercial robotic solutions are already available and deployed. However, the adoption by farmers of such solutions is still low for many reasons, an important one being the lack of trust in these systems. This is in great part due to the opaqueness and complexity of deep neural networks and the manufacturers' inability to provide valid guarantees on their performance. Conformal prediction, a well-established methodology in the machine learning community, is an efficient and reliable strategy for providing trustworthy guarantees on the predictions of any black-box model under very minimal constraints. Bridging the gap between the safe machine learning and precision agriculture communities, this article showcases conformal prediction in action on the task of precision weeding through deep learning-based image classification. After a detailed presentation of the conformal prediction methodology and the development of a precision spraying pipeline based on a ''conformalized'' neural network and well-defined spraying decision rules, the article evaluates this pipeline on two real-world scenarios: one under in-distribution conditions, the other reflecting a near out-of-distribution setting. The results show that we are able to provide formal, i.e. certifiable, guarantees on spraying at least 90% of the weeds.
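The guarantee comes from split conformal prediction: calibrate a score threshold on a held-out set, then spray whenever the label "weed" appears in the prediction set. A minimal NumPy sketch under those assumptions (a hypothetical binary crop/weed classifier and synthetic calibration data, not the authors' exact pipeline or decision rules):

```python
import numpy as np

def conformal_quantile(cal_probs, cal_labels, alpha=0.1):
    """Split-conformal threshold from held-out calibration data.
    cal_probs: (n, K) softmax outputs; cal_labels: (n,) true classes."""
    n = len(cal_labels)
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]    # nonconformity scores
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)  # finite-sample correction
    return np.quantile(scores, level, method="higher")

def prediction_set(probs, qhat):
    """All classes whose nonconformity stays below the threshold."""
    return np.where(1.0 - probs <= qhat)[0]

# Toy usage with a hypothetical crop (0) / weed (1) classifier.
rng = np.random.default_rng(0)
cal_probs = rng.dirichlet([2.0, 2.0], size=500)
cal_labels = rng.integers(0, 2, size=500)
qhat = conformal_quantile(cal_probs, cal_labels, alpha=0.1)

test_probs = np.array([0.30, 0.70])   # softmax output for one test image
WEED = 1
spray = WEED in prediction_set(test_probs, qhat)  # spray if "weed" is in the set
print(f"qhat={qhat:.3f}, spray={spray}")
```

Calibrating on weed-labelled images alone is one standard way to make the marginal coverage guarantee hold specifically for the weed class, which is what an "at least 90% of the weeds" claim requires.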
Related papers
- Reliable Probabilistic Human Trajectory Prediction for Autonomous Applications [1.8294777056635267]
Vehicle systems need reliable, accurate, fast, resource-efficient, scalable, and low-latency trajectory predictions.
This paper presents a lightweight method to address these requirements, combining Long Short-Term Memory and Mixture Density Networks.
We discuss essential requirements for human trajectory prediction in autonomous vehicle applications and demonstrate our method's performance using traffic-related datasets.
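As a concrete illustration of the LSTM-plus-MDN combination, here is a generic PyTorch sketch; the layer sizes, the 2D next-position output, and the diagonal-Gaussian mixture are illustrative assumptions, not the paper's architecture:

```python
import torch
import torch.nn as nn

class LSTMMDN(nn.Module):
    """Sketch: LSTM encoder with a mixture-density head that outputs a
    k-component Gaussian mixture over the next 2D position."""

    def __init__(self, input_dim=2, hidden=64, k=5):
        super().__init__()
        self.k = k
        self.lstm = nn.LSTM(input_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, k * 5)  # per component: weight, mu_x, mu_y, sigma_x, sigma_y

    def forward(self, traj):                  # traj: (batch, time, 2)
        _, (h, _) = self.lstm(traj)
        p = self.head(h[-1]).view(-1, self.k, 5)
        log_pi = torch.log_softmax(p[..., 0], dim=-1)              # mixture weights
        mu = p[..., 1:3]                                           # component means
        sigma = nn.functional.softplus(p[..., 3:5]) + 1e-3         # positive scales
        return log_pi, mu, sigma

def mdn_nll(log_pi, mu, sigma, target):
    """Negative log-likelihood of target (batch, 2) under the mixture."""
    dist = torch.distributions.Normal(mu, sigma)             # (batch, k, 2)
    log_prob = dist.log_prob(target.unsqueeze(1)).sum(-1)    # independent x/y
    return -torch.logsumexp(log_pi + log_prob, dim=-1).mean()
```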
arXiv Detail & Related papers (2024-10-09T14:08:39Z)
- Multiclass Alignment of Confidence and Certainty for Network Calibration [10.15706847741555]
Recent studies reveal that deep neural networks (DNNs) are prone to making overconfident predictions.
We propose a new train-time calibration method, which features a simple, plug-and-play auxiliary loss known as multi-class alignment of predictive mean confidence and predictive certainty (MACC).
Our method achieves state-of-the-art calibration performance for both in-domain and out-domain predictions.
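The summary suggests an auxiliary loss that aligns per-class mean confidence with predictive certainty. The following PyTorch snippet is a loose illustration of that idea only; the exact MACC formulation differs, so treat every quantity here as an assumption:

```python
import torch

def macc_style_aux_loss(logits):
    """Illustrative auxiliary loss: align per-class mean confidence with a
    certainty proxy from predictive entropy. NOT the exact MACC loss."""
    probs = torch.softmax(logits, dim=-1)                       # (batch, K)
    mean_conf = probs.mean(dim=0)                               # per-class mean confidence
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(-1)   # (batch,)
    certainty = 1.0 - entropy / torch.log(torch.tensor(float(probs.shape[-1])))
    # certainty-weighted mean confidence per class, compared to the plain mean
    certain_conf = (probs * certainty.unsqueeze(1)).mean(dim=0) / certainty.mean()
    return (mean_conf - certain_conf).abs().mean()

# Used as: loss = cross_entropy + beta * macc_style_aux_loss(logits)
```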
arXiv Detail & Related papers (2023-09-06T00:56:24Z)
- Group-Conditional Conformal Prediction via Quantile Regression Calibration for Crop and Weed Classification [0.0]
This article presents the conformal prediction framework that provides valid statistical guarantees on the predictive performance of any black box prediction machine.
The framework is presented with a focus on its practical aspects, with special attention given to the Adaptive Prediction Sets (APS) approach.
To tackle the shortcoming that the standard guarantee holds only marginally, not per group, group-conditional conformal approaches are presented.
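For reference, a minimal NumPy sketch of the APS nonconformity score mentioned above (the randomized tie-breaking and the group-conditional quantile-regression calibration are omitted):

```python
import numpy as np

def aps_scores(probs, labels):
    """APS nonconformity: total probability mass of classes ranked at or
    above the true class (randomization omitted for simplicity)."""
    order = np.argsort(-probs, axis=1)              # classes, most likely first
    ranks = np.argsort(order, axis=1)               # rank of each class
    cumsum = np.cumsum(np.take_along_axis(probs, order, axis=1), axis=1)
    idx = np.arange(len(labels))
    return cumsum[idx, ranks[idx, labels]]

def aps_set(probs, qhat):
    """Smallest set of top-ranked classes whose mass reaches qhat."""
    order = np.argsort(-probs)
    cumsum = np.cumsum(probs[order])
    k = int(np.searchsorted(cumsum, qhat) + 1)
    return order[:k]

# Calibration usage: qhat = np.quantile(aps_scores(cal_probs, cal_labels), 0.9, method="higher")
```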
arXiv Detail & Related papers (2023-08-29T08:02:41Z)
- Federated Conformal Predictors for Distributed Uncertainty Quantification [83.50609351513886]
Conformal prediction is emerging as a popular paradigm for providing rigorous uncertainty quantification in machine learning.
In this paper, we extend conformal prediction to the federated learning setting.
We propose a weaker notion of partial exchangeability, better suited to the FL setting, and use it to develop the Federated Conformal Prediction framework.
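One simple, conservative way to realize federated conformal calibration is for each client to share only its local score quantile. The sketch below uses a max aggregation purely for illustration; the paper derives a tighter quantile-of-quantiles rule under partial exchangeability:

```python
import numpy as np

def local_quantile(scores, alpha):
    """Client-side: finite-sample-corrected quantile of conformal scores."""
    n = len(scores)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return np.quantile(scores, level, method="higher")

def federated_qhat(client_scores, alpha):
    """Server-side: conservative aggregate of client quantiles (assumed max rule)."""
    return max(local_quantile(s, alpha) for s in client_scores)

rng = np.random.default_rng(1)
clients = [rng.beta(2, 5, size=200 + 50 * i) for i in range(3)]  # synthetic scores
print(federated_qhat(clients, alpha=0.1))
```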
arXiv Detail & Related papers (2023-05-27T19:57:27Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
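The Nadaraya-Watson estimate at the core of the method is easy to state. A generic sketch on feature embeddings (the Gaussian kernel, bandwidth, and the use of kernel mass as an out-of-distribution signal are illustrative choices, not the full NUQ method):

```python
import numpy as np

def nadaraya_watson_probs(x, X, y, n_classes, bandwidth=1.0):
    """Kernel-smoothed estimate of p(y = c | x) from embeddings X, labels y."""
    d2 = ((X - x) ** 2).sum(axis=1)
    w = np.exp(-0.5 * d2 / bandwidth**2)           # Gaussian kernel weights
    probs = np.array([w[y == c].sum() for c in range(n_classes)])
    total = probs.sum()
    return probs / total, total                    # label distribution + kernel mass

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 8))                      # e.g., penultimate-layer features
y = rng.integers(0, 3, size=200)
p, mass = nadaraya_watson_probs(rng.normal(size=8), X, y, n_classes=3)
print(p, mass)   # low kernel mass signals the query lies far from the data
```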
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Leveraging Unlabeled Data to Predict Out-of-Distribution Performance [63.740181251997306]
Real-world machine learning deployments are characterized by mismatches between the source (training) and target (test) distributions.
In this work, we investigate methods for predicting the target domain accuracy using only labeled source data and unlabeled target data.
We propose Average Thresholded Confidence (ATC), a practical method that learns a threshold on the model's confidence on source data and predicts target accuracy as the fraction of unlabeled target examples whose confidence exceeds that threshold.
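ATC's confidence variant is compact enough to sketch directly (synthetic confidences stand in for a real model's outputs):

```python
import numpy as np

def learn_atc_threshold(src_conf, src_correct):
    """Pick t so that the fraction of source examples with confidence > t
    matches source accuracy (the core of ATC)."""
    acc = src_correct.mean()
    return np.quantile(src_conf, 1.0 - acc)   # acc of the mass should lie above t

def predict_target_accuracy(tgt_conf, t):
    """ATC estimate: fraction of unlabeled target examples above the threshold."""
    return (tgt_conf > t).mean()

rng = np.random.default_rng(3)
src_conf = rng.beta(5, 2, size=1000)
src_correct = rng.random(1000) < src_conf     # toy correctness indicator
t = learn_atc_threshold(src_conf, src_correct)
tgt_conf = rng.beta(4, 3, size=1000)          # shifted target confidences
print(f"t={t:.3f}, predicted target acc={predict_target_accuracy(tgt_conf, t):.3f}")
```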
arXiv Detail & Related papers (2022-01-11T23:01:12Z)
- Neural Predictive Monitoring under Partial Observability [4.1316328854247155]
We present a learning-based method for predictive monitoring (PM) that produces accurate and reliable reachability predictions despite partial observability (PO).
Our method results in highly accurate reachability predictions and error detection, as well as tight prediction regions with guaranteed coverage.
arXiv Detail & Related papers (2021-08-16T15:08:20Z)
- Multi Agent System for Machine Learning Under Uncertainty in Cyber Physical Manufacturing System [78.60415450507706]
Recent advancements in predictive machine learning have led to its application in various use cases in manufacturing.
Most research has focused on maximising predictive accuracy without addressing the associated uncertainty.
In this paper, we determine the sources of uncertainty in machine learning and establish the success criteria of a machine learning system to function well under uncertainty.
arXiv Detail & Related papers (2021-07-28T10:28:05Z)
- Private Prediction Sets [72.75711776601973]
Machine learning systems need reliable uncertainty quantification and protection of individuals' privacy.
We present a framework that treats these two desiderata jointly.
We evaluate the method on large-scale computer vision datasets.
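One simple way to combine the two desiderata is to privatize the conformal quantile itself, e.g., via a Laplace-noised histogram of scores. The sketch below is this generic mechanism only, not the paper's construction or its privacy accounting:

```python
import numpy as np

def dp_conformal_quantile(scores, alpha=0.1, eps=1.0, bins=100, rng=None):
    """Sketch: conformal quantile from a Laplace-noised histogram of scores
    in [0, 1] (a standard DP-histogram approach, assumed for illustration)."""
    rng = rng or np.random.default_rng()
    counts, edges = np.histogram(scores, bins=bins, range=(0.0, 1.0))
    noisy = counts + rng.laplace(scale=1.0 / eps, size=bins)  # noised counts
    cdf = np.cumsum(np.clip(noisy, 0, None))
    cdf = cdf / cdf[-1]
    idx = int(np.searchsorted(cdf, 1 - alpha))
    return edges[min(idx + 1, bins)]               # conservative bin upper edge

rng = np.random.default_rng(4)
scores = rng.beta(2, 5, size=1000)                 # synthetic nonconformity scores
print(dp_conformal_quantile(scores, alpha=0.1, eps=1.0, rng=rng))
```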
arXiv Detail & Related papers (2021-02-11T18:59:11Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Predictive Business Process Monitoring via Generative Adversarial Nets: The Case of Next Event Prediction [0.026249027950824504]
This paper proposes a novel adversarial training framework to address the problem of next event prediction.
It works by pitting one neural network against the other in a two-player game, which leads to predictions that are indistinguishable from the ground truth.
It systematically outperforms all baselines in terms of both accuracy and earliness of prediction, despite using a simple network architecture and a naive feature encoding.
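A generic sketch of such an adversarial setup for next-event prediction (the GRU encoders, layer sizes, and one-hot event encodings are assumptions for illustration, not the paper's design):

```python
import torch
import torch.nn as nn

# Hypothetical shapes: prefixes encoded as (batch, T, d); events as one-hot of size v.
class Generator(nn.Module):
    def __init__(self, d=16, v=10):
        super().__init__()
        self.rnn = nn.GRU(d, 32, batch_first=True)
        self.out = nn.Linear(32, v)
    def forward(self, prefix):
        _, h = self.rnn(prefix)
        return torch.softmax(self.out(h[-1]), dim=-1)  # predicted next-event dist.

class Discriminator(nn.Module):
    def __init__(self, d=16, v=10):
        super().__init__()
        self.rnn = nn.GRU(d, 32, batch_first=True)
        self.out = nn.Linear(32 + v, 1)
    def forward(self, prefix, nxt):
        _, h = self.rnn(prefix)
        return torch.sigmoid(self.out(torch.cat([h[-1], nxt], dim=-1)))

G, D = Generator(), Discriminator()
bce = nn.BCELoss()
prefix = torch.randn(8, 5, 16)                       # toy batch of event prefixes
real_next = torch.eye(10)[torch.randint(0, 10, (8,))]
fake_next = G(prefix)                                # soft distribution, fed as-is
d_loss = bce(D(prefix, real_next), torch.ones(8, 1)) + \
         bce(D(prefix, fake_next.detach()), torch.zeros(8, 1))
g_loss = bce(D(prefix, fake_next), torch.ones(8, 1)) # generator tries to fool D
```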
arXiv Detail & Related papers (2020-03-25T08:31:28Z)