Class-Incremental Learning of Plant and Disease Detection: Growing
Branches with Knowledge Distillation
- URL: http://arxiv.org/abs/2304.06619v2
- Date: Mon, 11 Sep 2023 15:02:00 GMT
- Title: Class-Incremental Learning of Plant and Disease Detection: Growing
Branches with Knowledge Distillation
- Authors: Mathieu Pagé Fortin
- Abstract summary: This paper investigates the problem of class-incremental object detection for agricultural applications.
We adapt two public datasets to include new categories over time, simulating a more realistic and dynamic scenario.
We compare three class-incremental learning methods that leverage different forms of knowledge distillation to mitigate catastrophic forgetting.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper investigates the problem of class-incremental object detection for
agricultural applications where a model needs to learn new plant species and
diseases incrementally without forgetting the previously learned ones. We adapt
two public datasets to include new categories over time, simulating a more
realistic and dynamic scenario. We then compare three class-incremental
learning methods that leverage different forms of knowledge distillation to
mitigate catastrophic forgetting. Our experiments show that all three methods
suffer from catastrophic forgetting, but the Dynamic Y-KD approach, which
additionally uses a dynamic architecture that grows new branches to learn new
tasks, outperforms ILOD and Faster-ILOD in most settings on both new and old
classes.
These results highlight the challenges and opportunities of continual object
detection for agricultural applications. In particular, we hypothesize that the
large intra-class and small inter-class variability that is typical of plant
images exacerbates the difficulty of learning new categories without interfering
with previous knowledge. We publicly release our code to encourage future work.
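Since ILOD, Faster-ILOD, and Dynamic Y-KD all rest on distilling the old detector's outputs into the new one, a minimal sketch of that mechanism may help. This is an illustration in PyTorch, not the paper's released code; every name in it (`distillation_loss`, `incremental_step`, `lambda_kd`) is invented for the example, and a plain cross-entropy stands in for the full detection loss.

```python
import torch
import torch.nn.functional as F

def distillation_loss(old_logits, new_logits, num_old_classes):
    """L2 distillation on the logits of previously learned classes.

    old_logits: outputs of the frozen pre-update detector (teacher).
    new_logits: outputs of the current detector (student) on the same inputs.
    Only the first num_old_classes outputs are constrained, leaving the
    student free to fit the newly added classes.
    """
    return F.mse_loss(new_logits[:, :num_old_classes],
                      old_logits[:, :num_old_classes].detach())

def incremental_step(new_logits, old_logits, labels, num_old_classes, lambda_kd=1.0):
    # Task loss on the new data (cross-entropy as a stand-in for the
    # full detection loss)...
    task_loss = F.cross_entropy(new_logits, labels)
    # ...plus a distillation term that penalizes drift on old classes.
    kd_loss = distillation_loss(old_logits, new_logits, num_old_classes)
    return task_loss + lambda_kd * kd_loss
```

As the abstract describes, Dynamic Y-KD pairs this kind of distillation with a dynamic architecture that grows a new branch per task, so most new-class learning happens in fresh parameters rather than by overwriting old ones.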
Related papers
- Leveraging Old Knowledge to Continually Learn New Classes in Medical
Images [16.730335437094592]
We focus on how old knowledge can be leveraged to learn new classes without catastrophic forgetting.
Our solution achieves superior performance over state-of-the-art baselines in terms of class accuracy and forgetting.
arXiv Detail & Related papers (2023-03-24T02:10:53Z)
- Memorizing Complementation Network for Few-Shot Class-Incremental Learning [109.4206979528375]
We propose a Memorizing Complementation Network (MCNet) that ensembles multiple models whose memorized knowledge complements one another on novel tasks.
We develop a Prototype Smoothing Hard-mining Triplet (PSHT) loss that pushes novel samples away not only from each other within the current task but also from the old distribution (see the sketch after this entry).
arXiv Detail & Related papers (2022-08-11T02:32:41Z)
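The PSHT loss is only named above; as a rough illustration of its stated goal (separating novel samples from each other and from the old distribution), here is a hinge-based, triplet-style sketch. It omits the prototype smoothing and hard mining that give PSHT its name, and all identifiers are invented for the example.

```python
import torch
import torch.nn.functional as F

def triplet_vs_old_prototypes(embeddings, labels, old_prototypes, margin=0.5):
    """Pull each novel embedding toward its class mean (positive) and push
    it past the nearest old-class prototype (negative) by a margin.

    embeddings:     (N, D) features of novel-class samples.
    labels:         (N,) novel-class labels.
    old_prototypes: (C_old, D) class means retained from previous tasks.
    """
    loss = 0.0
    for c in labels.unique():
        feats = embeddings[labels == c]
        positive = feats.mean(dim=0, keepdim=True)           # in-class anchor
        d_pos = (feats - positive).pow(2).sum(dim=1)         # pull together
        d_neg = torch.cdist(feats, old_prototypes).min(dim=1).values.pow(2)
        loss = loss + F.relu(d_pos - d_neg + margin).mean()  # hinge margin
    return loss / len(labels.unique())
```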
- Continual Learning with Bayesian Model based on a Fixed Pre-trained Feature Extractor [55.9023096444383]
Current deep learning models are characterised by catastrophic forgetting of old knowledge when learning new classes.
Inspired by the process of learning new knowledge in human brains, we propose a Bayesian generative model for continual learning (one plausible instantiation is sketched below).
arXiv Detail & Related papers (2022-04-28T08:41:51Z)
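The summary does not say which generative model is used, so the following is only one plausible reading: class-conditional Gaussians fitted on features from the fixed pre-trained extractor. The appeal for continual learning is that adding a class merely adds a mean vector; earlier classes are never revisited or overwritten.

```python
import torch

class GaussianFeatureClassifier:
    """Class-conditional Gaussian classifier over frozen features.

    Assumes an isotropic, shared covariance and equal class priors,
    under which Bayes' rule reduces to nearest-class-mean prediction.
    """
    def __init__(self):
        self.means = {}  # class id -> (feature_dim,) mean vector

    def add_class(self, class_id, features):
        # features: (N, D) outputs of the frozen extractor for one class.
        # Continual-learning step: only this class's statistics change.
        self.means[class_id] = features.mean(dim=0)

    def predict(self, features):
        ids = list(self.means)
        mus = torch.stack([self.means[c] for c in ids])  # (C, D)
        dists = torch.cdist(features, mus)               # (N, C) distances
        return [ids[j] for j in dists.argmin(dim=1).tolist()]
```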
- LifeLonger: A Benchmark for Continual Disease Classification [59.13735398630546]
We introduce LifeLonger, a benchmark for continual disease classification on the MedMNIST collection.
Task- and class-incremental learning of diseases address the issue of classifying new samples without re-training the models from scratch.
Cross-domain incremental learning addresses the issue of dealing with datasets originating from different institutions while retaining the previously obtained knowledge.
arXiv Detail & Related papers (2022-04-12T12:25:05Z)
- Static-Dynamic Co-Teaching for Class-Incremental 3D Object Detection [71.18882803642526]
Deep learning approaches have shown remarkable performance in the 3D object detection task.
They suffer from a catastrophic performance drop when incrementally learning new classes without revisiting the old data.
This "catastrophic forgetting" phenomenon impedes the deployment of 3D object detection approaches in real-world scenarios.
We present the first solution, SDCoT, a novel static-dynamic co-teaching method (a schematic sketch follows this entry).
arXiv Detail & Related papers (2021-12-14T09:03:41Z)
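Reading only the method's name and the abstract summary, the co-teaching presumably pairs a frozen "static" teacher (the old detector, preserving old classes) with a "dynamic" teacher updated alongside the student. The sketch below is schematic: it assumes models that return dense logits rather than full 3D detection outputs, and an EMA dynamic teacher, which may differ from the actual SDCoT design.

```python
import torch
import torch.nn.functional as F

def ema_update(dynamic_teacher, student, decay=0.999):
    """The dynamic teacher tracks the student as an exponential moving
    average (it starts as a deep copy of the student)."""
    with torch.no_grad():
        for t, s in zip(dynamic_teacher.parameters(), student.parameters()):
            t.mul_(decay).add_(s, alpha=1 - decay)

def co_teaching_losses(student, static_teacher, dynamic_teacher, images):
    with torch.no_grad():
        old_targets = static_teacher(images)          # pseudo-labels for old classes
        consistency_targets = dynamic_teacher(images)
    preds = student(images)
    pseudo_label_loss = F.mse_loss(preds, old_targets)         # keep old classes
    consistency_loss = F.mse_loss(preds, consistency_targets)  # stabilize new learning
    return pseudo_label_loss, consistency_loss
```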
- Discriminative Distillation to Reduce Class Confusion in Continual Learning [57.715862676788156]
Class confusion can degrade classification performance during continual learning.
We propose a discriminative distillation strategy that helps the classifier learn discriminative features between confusing classes (a rough sketch follows this entry).
arXiv Detail & Related papers (2021-08-11T12:46:43Z)
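The two-line summary leaves the mechanism open; one way to make distillation "discriminative" is to concentrate the soft targets on the classes the teacher itself finds most confusable for each sample. The sketch below does exactly that and is an assumption, not the paper's verified formulation.

```python
import torch
import torch.nn.functional as F

def confusion_focused_distillation(teacher_logits, student_logits, k=5, T=2.0):
    """Distill only over each sample's top-k most confusable classes.

    Restricting the soft targets to the classes the teacher ranks highest
    concentrates the distillation signal where confusion actually lives.
    """
    topk = teacher_logits.topk(k, dim=1).indices        # (N, k) class indices
    t_sub = teacher_logits.gather(1, topk) / T          # teacher sub-logits
    s_sub = student_logits.gather(1, topk) / T          # student sub-logits
    return F.kl_div(F.log_softmax(s_sub, dim=1),
                    F.softmax(t_sub, dim=1),
                    reduction="batchmean") * (T * T)
```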
- Class-incremental learning: survey and performance evaluation on image classification [38.27344435075399]
Incremental learning allows for efficient resource usage by eliminating the need to retrain from scratch at the arrival of new data.
The main challenge for incremental learning is catastrophic forgetting, which refers to the precipitous drop in performance on previously learned tasks after learning a new one.
Recently, we have seen a shift towards class-incremental learning, where the learner must discriminate at inference time between all classes seen in previous tasks without recourse to a task-ID (illustrated in the sketch below).
arXiv Detail & Related papers (2020-10-28T23:28:15Z)
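The task-ID distinction in the last sentence is worth pinning down. A minimal sketch, assuming each task owns a contiguous block of `classes_per_task` logits:

```python
import torch

def task_incremental_predict(logits, task_id, classes_per_task):
    """Task-IL: the task-ID restricts prediction to that task's classes."""
    lo = task_id * classes_per_task
    hi = lo + classes_per_task
    return lo + logits[:, lo:hi].argmax(dim=1)

def class_incremental_predict(logits, num_seen_classes):
    """Class-IL: no task-ID, so all classes seen so far compete in one
    argmax -- the setting where catastrophic forgetting bites hardest."""
    return logits[:, :num_seen_classes].argmax(dim=1)
```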
- Self-Supervised Learning Aided Class-Incremental Lifelong Learning [17.151579393716958]
We study the issue of catastrophic forgetting in class-incremental learning (Class-IL).
During Class-IL training, the model has no knowledge of future tasks, so it extracts only the features needed for the tasks learned so far, which are insufficient for joint classification.
We propose to combine self-supervised learning, which can provide effective representations without requiring labels, with Class-IL to partly get around this problem (an example pretext loss is sketched below).
arXiv Detail & Related papers (2020-06-10T15:15:27Z)
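The summary does not say which self-supervised task is used; rotation prediction is a common choice and serves here only as an example of attaching an auxiliary label-free loss to a Class-IL model. `backbone` and `rot_head` are hypothetical modules (a feature extractor and a 4-way linear head).

```python
import torch
import torch.nn.functional as F

def rotation_pretext_loss(backbone, rot_head, images):
    """Auxiliary self-supervised loss: predict which of four rotations
    was applied (images are NCHW). Features learned this way are broader
    than what the current task's labels demand, leaving room for
    classes that arrive later.
    """
    rotated, targets = [], []
    for k in range(4):  # 0, 90, 180, 270 degrees
        rotated.append(torch.rot90(images, k, dims=(2, 3)))
        targets.append(torch.full((images.size(0),), k, dtype=torch.long))
    rotated = torch.cat(rotated)
    targets = torch.cat(targets).to(images.device)
    return F.cross_entropy(rot_head(backbone(rotated)), targets)
```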
- Few-Shot Class-Incremental Learning [68.75462849428196]
We focus on a challenging but practical few-shot class-incremental learning (FSCIL) problem.
FSCIL requires CNN models to incrementally learn new classes from very few labelled samples, without forgetting the previously learned ones.
We represent the knowledge using a neural gas (NG) network, which can learn and preserve the topology of the feature manifold formed by different classes (a generic NG update is sketched below).
arXiv Detail & Related papers (2020-04-23T03:38:33Z)
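For readers unfamiliar with neural gas: it is a prototype-based method in which every unit moves toward each input with a strength that decays with the unit's distance rank, which is what lets it "learn and preserve the topology of the feature manifold". Below is a generic online update in the Martinetz style, not necessarily the paper's exact variant.

```python
import torch

def neural_gas_step(units, x, lr=0.05, lam=2.0):
    """One online neural-gas update: every unit moves toward the input,
    with a step size that decays exponentially in its distance rank.
    Winners move most, so units spread over the data manifold.

    units: (K, D) prototype vectors; x: (D,) one feature vector.
    """
    dists = (units - x).pow(2).sum(dim=1)      # squared distances to input
    ranks = dists.argsort().argsort().float()  # 0 = closest unit
    strength = torch.exp(-ranks / lam) * lr    # rank-based step size
    return units + strength.unsqueeze(1) * (x - units)
```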