BTS: Bifold Teacher-Student in Semi-Supervised Learning for Indoor
Two-Room Presence Detection Under Time-Varying CSI
- URL: http://arxiv.org/abs/2212.10802v3
- Date: Tue, 6 Jun 2023 05:08:45 GMT
- Title: BTS: Bifold Teacher-Student in Semi-Supervised Learning for Indoor
Two-Room Presence Detection Under Time-Varying CSI
- Authors: Li-Hsiang Shen, Kai-Jui Chen, An-Hung Hsiao, Kai-Ten Feng
- Abstract summary: We propose a bifold teacher-student (BTS) learning approach for indoor human presence detection in an adjoining two-room scenario.
The proposed SSL-based primal-dual teacher-student network intelligently learns spatial and temporal features from labeled and unlabeled CSI datasets.
Experimental results demonstrate that the proposed BTS system sustains accuracy after retraining the model with unlabeled data.
- Score: 4.301276597844756
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, indoor human presence detection based on supervised learning
(SL) and channel state information (CSI) has attracted much attention. However,
existing studies that rely on spatial information of CSI are susceptible to
environmental changes which degrade prediction accuracy. Moreover, SL-based
methods require time-consuming data labeling for retraining models. Therefore,
it is imperative to design a continuously monitored model using a
semi-supervised learning (SSL) based scheme. In this paper, we conceive a
bifold teacher-student (BTS) learning approach for indoor human presence
detection in an adjoining two-room scenario. The proposed SSL-based primal-dual
teacher-student network intelligently learns spatial and temporal features from
labeled and unlabeled CSI datasets. Additionally, the enhanced penalized loss
function leverages entropy and distance measures to distinguish drifted data,
i.e., features of new datasets affected by time-varying effects and altered
from the original distribution. Experimental results demonstrate that the
proposed BTS system sustains asymptotic accuracy after retraining the model
with unlabeled data. Furthermore, BTS achieves the highest detection accuracy
among existing SSL-based models while approaching the asymptotic performance
of SL-based methods.
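To make the abstract's two ingredients concrete, below is a minimal sketch of an EMA teacher-student update together with an entropy/distance penalty that filters drifted unlabeled samples. This is an illustration under our own assumptions: the function names, thresholds, and the use of a labeled-data feature centroid are not taken from the paper's implementation.

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax over class logits."""
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def prediction_entropy(probs):
    """Shannon entropy of each sample's class distribution."""
    return -(probs * np.log(probs + 1e-12)).sum(axis=1)

def drift_weights(features, probs, centroid, tau_entropy, tau_dist):
    """Weight 1.0 for samples that look in-distribution, 0.0 for drifted
    ones: high prediction entropy or a large feature distance from the
    labeled-data centroid marks a sample as drifted."""
    h = prediction_entropy(probs)
    d = np.linalg.norm(features - centroid, axis=1)
    return ((h < tau_entropy) & (d < tau_dist)).astype(float)

def ema_update(teacher_params, student_params, alpha=0.99):
    """Teacher parameters track the student via an exponential moving
    average, as in mean-teacher-style SSL."""
    return alpha * teacher_params + (1.0 - alpha) * student_params

# Toy batch of unlabeled "CSI features": one confident in-distribution
# sample, one uncertain sample far from the centroid (drifted).
features = np.array([[0.1, 0.1], [3.0, 3.0]])
logits = np.array([[5.0, 0.0], [0.0, 0.0]])
probs = softmax(logits)
weights = drift_weights(features, probs, centroid=np.zeros(2),
                        tau_entropy=0.3, tau_dist=1.0)
# Pseudo-labels from the teacher; drifted samples get zero loss weight.
pseudo = probs.argmax(axis=1)
```

In a full pipeline the student would minimize a consistency loss against `pseudo`, scaled per-sample by `weights`, while the teacher is refreshed with `ema_update` after each step.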
Related papers
- PTMs-TSCIL Pre-Trained Models Based Class-Incremental Learning [7.784244204592032]
Class-incremental learning (CIL) for time series data faces challenges in balancing stability against catastrophic forgetting and plasticity for new knowledge acquisition.
We present the first exploration of PTM-based Time Series Class-Incremental Learning (TSCIL).
arXiv Detail & Related papers (2025-03-10T10:27:21Z)
- A New Perspective on Time Series Anomaly Detection: Faster Patch-based Broad Learning System [59.38402187365612]
Time series anomaly detection (TSAD) has been a research hotspot in both academia and industry in recent years.
Deep learning is not necessarily required for TSAD, given limitations such as its slow training and inference speed.
We propose the Contrastive Patch-based Broad Learning System (CBLS).
arXiv Detail & Related papers (2024-12-07T01:58:18Z)
- Context-Aware Predictive Coding: A Representation Learning Framework for WiFi Sensing [0.0]
WiFi sensing is an emerging technology that utilizes wireless signals for various sensing applications.
In this paper, we introduce a novel SSL framework called Context-Aware Predictive Coding (CAPC).
CAPC effectively learns from unlabelled data and adapts to diverse environments.
Our evaluations demonstrate that CAPC not only outperforms other SSL methods and supervised approaches, but also achieves superior generalization capabilities.
arXiv Detail & Related papers (2024-09-16T17:59:49Z)
- Evaluating Fairness in Self-supervised and Supervised Models for Sequential Data [10.626503137418636]
Self-supervised learning (SSL) has become the de facto training paradigm of large models.
This study explores the impact of pre-training and fine-tuning strategies on fairness.
arXiv Detail & Related papers (2024-01-03T09:31:43Z)
- Boosting Transformer's Robustness and Efficacy in PPG Signal Artifact Detection with Self-Supervised Learning [0.0]
This study addresses the underutilization of abundant unlabeled data by employing self-supervised learning (SSL) to extract latent features from this data.
Our experiments demonstrate that SSL significantly enhances the Transformer model's ability to learn representations.
This approach holds promise for broader applications in PICU environments, where annotated data is often limited.
arXiv Detail & Related papers (2024-01-02T04:00:48Z)
- Self-supervised learning for skin cancer diagnosis with limited training data [0.196629787330046]
Self-supervised learning (SSL) is an alternative to the standard supervised pre-training on ImageNet for scenarios with limited training data.
We consider further SSL pre-training on task-specific datasets, where our implementation is motivated by supervised transfer learning.
We find minimal further SSL pre-training on task-specific data can be as effective as large-scale SSL pre-training on ImageNet for medical image classification tasks with limited labelled data.
arXiv Detail & Related papers (2024-01-01T08:11:38Z)
- FILP-3D: Enhancing 3D Few-shot Class-incremental Learning with Pre-trained Vision-Language Models [62.663113296987085]
Few-shot class-incremental learning aims to mitigate the catastrophic forgetting issue when a model is incrementally trained on limited data.
We introduce two novel components: the Redundant Feature Eliminator (RFE) and the Spatial Noise Compensator (SNC).
Considering the imbalance in existing 3D datasets, we also propose new evaluation metrics that offer a more nuanced assessment of a 3D FSCIL model.
arXiv Detail & Related papers (2023-12-28T14:52:07Z)
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Progressive Feature Adjustment for Semi-supervised Learning from Pretrained Models [39.42802115580677]
Semi-supervised learning (SSL) can leverage both labeled and unlabeled data to build a predictive model.
Recent literature suggests that naively applying state-of-the-art SSL with a pretrained model fails to unleash the full potential of training data.
We propose to use pseudo-labels from the unlabelled data to update the feature extractor in a way that is less sensitive to incorrect labels.
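As a rough sketch of confidence-filtered pseudo-labeling of this kind (the threshold value and function name below are our assumptions, not that paper's implementation):

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.95):
    """Keep only high-confidence predictions as pseudo-labels, so that
    updates to the feature extractor see fewer incorrect labels."""
    confidence = probs.max(axis=1)
    keep = confidence >= threshold
    return probs.argmax(axis=1)[keep], np.where(keep)[0]

# Example: only the first sample is confident enough to pseudo-label.
labels, idx = select_pseudo_labels(np.array([[0.99, 0.01],
                                             [0.60, 0.40]]))
```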
arXiv Detail & Related papers (2023-09-09T01:57:14Z)
- Self-Supervision for Tackling Unsupervised Anomaly Detection: Pitfalls and Opportunities [50.231837687221685]
Self-supervised learning (SSL) has transformed machine learning and its many real world applications.
Unsupervised anomaly detection (AD) has also capitalized on SSL, by self-generating pseudo-anomalies.
arXiv Detail & Related papers (2023-08-28T07:55:01Z)
- DiffSTG: Probabilistic Spatio-Temporal Graph Forecasting with Denoising Diffusion Models [53.67562579184457]
This paper focuses on probabilistic STG forecasting, which is challenging due to the difficulty in modeling uncertainties and complex dependencies.
We present the first attempt to generalize the popular denoising diffusion models to STGs, leading to a novel non-autoregressive framework called DiffSTG.
Our approach combines the intrinsic spatio-temporal learning capabilities of STNNs with the uncertainty measurements of diffusion models.
arXiv Detail & Related papers (2023-01-31T13:42:36Z)
- Collaborative Intelligence Orchestration: Inconsistency-Based Fusion of Semi-Supervised Learning and Active Learning [60.26659373318915]
Active learning (AL) and semi-supervised learning (SSL) are two effective, but often isolated, means to alleviate the data-hungry problem.
We propose an innovative inconsistency-based virtual adversarial algorithm to further investigate SSL-AL's potential superiority.
Two real-world case studies visualize the practical industrial value of applying and deploying the proposed data sampling algorithm.
arXiv Detail & Related papers (2022-06-07T13:28:43Z)
- Trash to Treasure: Harvesting OOD Data with Cross-Modal Matching for Open-Set Semi-Supervised Learning [101.28281124670647]
Open-set semi-supervised learning (open-set SSL) investigates a challenging but practical scenario where out-of-distribution (OOD) samples are contained in the unlabeled data.
We propose a novel training mechanism that could effectively exploit the presence of OOD data for enhanced feature learning.
Our approach substantially lifts the performance on open-set SSL and outperforms the state-of-the-art by a large margin.
arXiv Detail & Related papers (2021-08-12T09:14:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.