Self-Supervision for Tackling Unsupervised Anomaly Detection: Pitfalls
and Opportunities
- URL: http://arxiv.org/abs/2308.14380v1
- Date: Mon, 28 Aug 2023 07:55:01 GMT
- Authors: Leman Akoglu and Jaemin Yoo
- Abstract summary: Self-supervised learning (SSL) has transformed machine learning and its many real-world applications.
Unsupervised anomaly detection (AD) has also capitalized on SSL, by self-generating pseudo-anomalies.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Self-supervised learning (SSL) has recently transformed machine
learning and many of its real-world applications by learning on massive amounts
of unlabeled data via self-generated supervisory signals.
Unsupervised anomaly detection (AD) has also capitalized on SSL, by
self-generating pseudo-anomalies through various data augmentation functions or
external data exposure. In this vision paper, we first underline the importance
of the choice of SSL strategy for AD performance, presenting evidence and
studies from the AD literature. Equipped with the understanding that SSL incurs
various hyperparameters (HPs) to carefully tune, we present recent developments
on unsupervised model selection and augmentation tuning for SSL-based AD. We
then highlight emerging challenges and future opportunities; on designing new
pretext tasks and augmentation functions for different data modalities,
creating novel model selection solutions for systematically tuning the SSL HPs,
as well as on capitalizing on the potential of pretrained foundation models on
AD through effective density estimation.
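To make the self-generated pseudo-anomaly idea above concrete, here is a minimal toy sketch (not the paper's method; the scaling augmentation, data, and all function names are illustrative assumptions). A classifier is trained on the pretext task of separating normal data from augmented copies, and its predicted pseudo-anomaly probability is reused as an anomaly score:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# "Normal" training data: unlabeled points clustered near the origin.
X_train = rng.normal(0.0, 1.0, size=(500, 8))

# Self-generated pseudo-anomalies via a hypothetical augmentation function:
# scaling pushes points away from the support of the normal data.
def augment(x, scale=3.0):
    return scale * x

X_pseudo = augment(X_train)

# Pretext task: classify normal (0) vs. pseudo-anomaly (1).
X = np.vstack([X_train, X_pseudo])
y = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_pseudo))])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# The classifier's pseudo-anomaly probability serves as the anomaly score.
def anomaly_score(x):
    return clf.predict_proba(x)[:, 1]

X_normal = rng.normal(0.0, 1.0, size=(200, 8))  # in-distribution test points
X_anom = rng.normal(5.0, 1.0, size=(200, 8))    # true anomalies, far from origin
print(anomaly_score(X_normal).mean(), anomaly_score(X_anom).mean())
```

Note how the detector's quality hinges on whether the chosen augmentation happens to mimic the (unknown) true anomalies; this is exactly the augmentation-as-hyperparameter pitfall the abstract highlights.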
Related papers
- A Survey of the Self Supervised Learning Mechanisms for Vision Transformers [5.152455218955949]
The application of self supervised learning (SSL) in vision tasks has gained significant attention.
We develop a comprehensive taxonomy that systematically classifies the SSL techniques.
We discuss the motivations behind SSL, review popular pre-training tasks, and highlight the challenges and advancements in this field.
arXiv Detail & Related papers (2024-08-30T07:38:28Z) - TE-SSL: Time and Event-aware Self Supervised Learning for Alzheimer's Disease Progression Analysis [6.6584447062231895]
Alzheimer's Dementia (AD) represents one of the most pressing challenges in the field of neurodegenerative disorders.
Recent advancements in deep learning and various representation learning strategies, including self-supervised learning (SSL), have shown significant promise in enhancing medical image analysis.
We propose a novel framework, Time and Event-aware SSL (TE-SSL), which integrates time-to-event and event data as supervisory signals to refine the learning process.
arXiv Detail & Related papers (2024-07-09T13:41:32Z) - Reinforcement Learning-Guided Semi-Supervised Learning [20.599506122857328]
We propose a novel Reinforcement Learning Guided SSL method, RLGSSL, that formulates SSL as a one-armed bandit problem.
RLGSSL incorporates a carefully designed reward function that balances the use of labeled and unlabeled data to enhance generalization performance.
We demonstrate the effectiveness of RLGSSL through extensive experiments on several benchmark datasets and show that our approach achieves consistent superior performance compared to state-of-the-art SSL methods.
arXiv Detail & Related papers (2024-05-02T21:52:24Z) - Can We Break Free from Strong Data Augmentations in Self-Supervised Learning? [18.83003310612038]
Self-supervised learning (SSL) has emerged as a promising solution for addressing the challenge of limited labeled data in deep neural networks (DNNs).
We explore SSL behavior across a spectrum of augmentations, revealing their crucial role in shaping SSL model performance and learning mechanisms.
We propose a novel learning approach that integrates prior knowledge, with the aim of curtailing the need for extensive data augmentations.
arXiv Detail & Related papers (2024-04-15T12:53:48Z) - Understanding and Improving the Role of Projection Head in
Self-Supervised Learning [77.59320917894043]
Self-supervised learning (SSL) aims to produce useful feature representations without access to human-labeled data annotations.
Current contrastive learning approaches append a parametrized projection head to the end of some backbone network to optimize the InfoNCE objective.
This raises a fundamental question: Why is a learnable projection head required if we are to discard it after training?
arXiv Detail & Related papers (2022-12-22T05:42:54Z) - The Geometry of Self-supervised Learning Models and its Impact on
Transfer Learning [62.601681746034956]
Self-supervised learning (SSL) has emerged as a desirable paradigm in computer vision.
We propose a data-driven geometric strategy to analyze different SSL models using local neighborhoods in the feature space induced by each.
arXiv Detail & Related papers (2022-09-18T18:15:38Z) - Semi-Supervised and Unsupervised Deep Visual Learning: A Survey [76.2650734930974]
Semi-supervised learning and unsupervised learning offer promising paradigms to learn from an abundance of unlabeled visual data.
We review the recent advanced deep learning algorithms on semi-supervised learning (SSL) and unsupervised learning (UL) for visual recognition from a unified perspective.
arXiv Detail & Related papers (2022-08-24T04:26:21Z) - Data Augmentation is a Hyperparameter: Cherry-picked Self-Supervision
for Unsupervised Anomaly Detection is Creating the Illusion of Success [30.409069707518466]
Self-supervised learning (SSL) has emerged as a promising alternative to create supervisory signals to real-world problems.
Recent works have reported that the type of augmentation has a significant impact on accuracy.
This work sets out to put image-based SSAD under a larger lens and investigate the role of data augmentation in SSAD.
arXiv Detail & Related papers (2022-08-16T13:09:25Z) - Collaborative Intelligence Orchestration: Inconsistency-Based Fusion of
Semi-Supervised Learning and Active Learning [60.26659373318915]
Active learning (AL) and semi-supervised learning (SSL) are two effective, but often isolated, means to alleviate the data-hungry problem.
We propose an innovative Inconsistency-based virtual aDvErsarial algorithm to further investigate SSL-AL's potential superiority.
Two real-world case studies visualize the practical industrial value of applying and deploying the proposed data sampling algorithm.
arXiv Detail & Related papers (2022-06-07T13:28:43Z) - DATA: Domain-Aware and Task-Aware Pre-training [94.62676913928831]
We present DATA, a simple yet effective NAS approach specialized for self-supervised learning (SSL).
Our method achieves promising results across a wide range of computation costs on downstream tasks, including image classification, object detection and semantic segmentation.
arXiv Detail & Related papers (2022-03-17T02:38:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.