Unsupervised learning based end-to-end delayless generative fixed-filter
active noise control
- URL: http://arxiv.org/abs/2402.09460v1
- Date: Thu, 8 Feb 2024 06:14:12 GMT
- Title: Unsupervised learning based end-to-end delayless generative fixed-filter
active noise control
- Authors: Zhengding Luo, Dongyuan Shi, Xiaoyi Shen, Woon-Seng Gan
- Abstract summary: Delayless noise control is achieved by our earlier generative fixed-filter active noise control (GFANC) framework.
The one-dimensional convolutional neural network (1D CNN) in the co-processor requires initial training using labelled noise datasets.
We propose an unsupervised-GFANC approach to simplify the 1D CNN training process and enhance its practicality.
- Score: 22.809445468752262
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Delayless noise control is achieved by our earlier generative fixed-filter
active noise control (GFANC) framework through efficient coordination between
the co-processor and real-time controller. However, the one-dimensional
convolutional neural network (1D CNN) in the co-processor requires initial
training using labelled noise datasets. Labelling noise data can be
resource-intensive and may introduce some biases. In this paper, we propose an
unsupervised-GFANC approach to simplify the 1D CNN training process and enhance
its practicality. During training, the co-processor and real-time controller
are integrated into an end-to-end differentiable ANC system. This enables us to
use the accumulated squared error signal as the loss for training the 1D CNN.
With this unsupervised learning paradigm, the unsupervised-GFANC method not
only omits the labelling process but also exhibits better noise reduction
performance compared to the supervised GFANC method in real noise experiments.
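The core idea of the unsupervised loss can be sketched in a few lines: the control filter is a weighted combination of fixed sub-filters (the weights being the 1D CNN's output), the anti-noise passes through the secondary path, and the accumulated squared error at the error microphone is the training objective. All names and shapes below are illustrative assumptions, not the authors' code; in practice the filtering would be implemented in a differentiable framework such as PyTorch so gradients reach the CNN.

```python
import numpy as np

def accumulated_squared_error(primary_noise, sub_filters, weights, secondary_path):
    """One forward pass of the ANC loop; returns the accumulated squared error."""
    # Control filter: weighted combination of the pre-trained fixed sub-filters.
    control_filter = weights @ sub_filters                        # (filter_len,)
    # Anti-noise: primary noise filtered by the control filter.
    anti_noise = np.convolve(primary_noise, control_filter)[: len(primary_noise)]
    # The anti-noise travels through the secondary path before cancellation.
    cancelling = np.convolve(anti_noise, secondary_path)[: len(primary_noise)]
    # Residual error signal at the error microphone.
    error = primary_noise - cancelling
    return float(np.sum(error ** 2))

# Toy example with random noise and random sub-filters (illustrative only).
rng = np.random.default_rng(0)
noise = rng.standard_normal(256)
sub_filters = rng.standard_normal((4, 16)) * 0.1
path = np.array([1.0, 0.5, 0.25])
loss = accumulated_squared_error(noise, sub_filters, np.array([0.3, 0.1, 0.4, 0.2]), path)
```

With zero weights no anti-noise is produced, so the loss reduces to the energy of the unattenuated noise; training the CNN amounts to choosing weights that drive this quantity down.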
Related papers
- Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) to affine the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- Direct Unsupervised Denoising [60.71146161035649]
Unsupervised denoisers do not directly produce a single prediction, such as the MMSE estimate.
We present an alternative approach that trains a deterministic network alongside the VAE to directly predict a central tendency.
arXiv Detail & Related papers (2023-10-27T13:02:12Z)
- Understanding and Mitigating the Label Noise in Pre-training on Downstream Tasks [91.15120211190519]
This paper aims to understand the nature of noise in pre-training datasets and to mitigate its impact on downstream tasks.
We propose a light-weight black-box tuning method (NMTune) to affine the feature space to mitigate the malignant effect of noise.
arXiv Detail & Related papers (2023-09-29T06:18:15Z)
- Exploring Efficient Asymmetric Blind-Spots for Self-Supervised Denoising in Real-World Scenarios [44.31657750561106]
Noise in real-world scenarios is often spatially correlated, which causes many self-supervised algorithms to perform poorly.
We propose Asymmetric Tunable Blind-Spot Network (AT-BSN), where the blind-spot size can be freely adjusted.
We show that our method achieves state-of-the-art performance and is superior to other self-supervised algorithms in terms of computational overhead and visual effects.
arXiv Detail & Related papers (2023-03-29T15:19:01Z)
- Latent Class-Conditional Noise Model [54.56899309997246]
We introduce a Latent Class-Conditional Noise model (LCCN) to parameterize the noise transition under a Bayesian framework.
We then derive a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels.
Our approach safeguards the stable update of the noise transition, avoiding the arbitrary tuning from a mini-batch of samples used in previous methods.
arXiv Detail & Related papers (2023-02-19T15:24:37Z)
- CVF-SID: Cyclic multi-Variate Function for Self-Supervised Image Denoising by Disentangling Noise from Image [53.76319163746699]
We propose a novel and powerful self-supervised denoising method called CVF-SID.
CVF-SID can disentangle a clean image and noise maps from the input by leveraging various self-supervised loss terms.
It achieves state-of-the-art self-supervised image denoising performance and is comparable to other existing approaches.
arXiv Detail & Related papers (2022-03-24T11:59:28Z)
- Denoising Distantly Supervised Named Entity Recognition via a Hypergeometric Probabilistic Model [26.76830553508229]
Hypergeometric Learning (HGL) is a denoising algorithm for distantly supervised named entity recognition.
HGL takes both noise distribution and instance-level confidence into consideration.
Experiments show that HGL can effectively denoise the weakly-labeled data retrieved from distant supervision.
arXiv Detail & Related papers (2021-06-17T04:01:25Z)
- Robust Processing-In-Memory Neural Networks via Noise-Aware Normalization [26.270754571140735]
PIM accelerators often suffer from intrinsic noise in the physical components.
We propose a noise-agnostic method to achieve robust neural network performance against any noise setting.
arXiv Detail & Related papers (2020-07-07T06:51:28Z)
- Simultaneous Denoising and Dereverberation Using Deep Embedding Features [64.58693911070228]
We propose a joint training method for simultaneous speech denoising and dereverberation using deep embedding features.
At the denoising stage, the DC network is leveraged to extract noise-free deep embedding features.
At the dereverberation stage, instead of using the unsupervised K-means clustering algorithm, another neural network is utilized to estimate the anechoic speech.
arXiv Detail & Related papers (2020-04-06T06:34:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.