Investigation of Densely Connected Convolutional Networks with Domain
Adversarial Learning for Noise Robust Speech Recognition
- URL: http://arxiv.org/abs/2112.10108v1
- Date: Sun, 19 Dec 2021 10:29:17 GMT
- Title: Investigation of Densely Connected Convolutional Networks with Domain
Adversarial Learning for Noise Robust Speech Recognition
- Authors: Chia Yu Li and Ngoc Thang Vu
- Abstract summary: We investigate densely connected convolutional networks (DenseNets) and their extension with domain adversarial training for noise-robust speech recognition.
DenseNets are very deep, compact convolutional neural networks that have demonstrated substantial improvements over state-of-the-art results in computer vision.
- Score: 41.88097793717185
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We investigate densely connected convolutional networks (DenseNets) and
their extension with domain adversarial training for noise-robust speech
recognition. DenseNets are very deep, compact convolutional neural networks
that have demonstrated substantial improvements over state-of-the-art results
in computer vision. Our experimental results reveal that DenseNets are more
robust against noise than other neural-network-based models such as deep
feed-forward neural networks and convolutional neural networks. Moreover,
domain adversarial learning further improves the robustness of DenseNets under
both known and unknown noise conditions.
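The core technique here, domain adversarial training, is commonly implemented with a gradient reversal layer (GRL) that flips the gradients coming from a noise-condition classifier, pushing the shared features toward domain invariance. Below is a minimal PyTorch sketch under that assumption; the feed-forward feature extractor, layer sizes, and head names are illustrative placeholders, not the authors' DenseNet code.

```python
# Minimal sketch of domain adversarial training via a gradient reversal
# layer (GRL). All sizes and module names are illustrative.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) gradients flowing back into the feature extractor.
        return -ctx.lambd * grad_output, None

class DATModel(nn.Module):
    def __init__(self, feat_dim=40, hidden=256, n_senones=2000, n_domains=4):
        super().__init__()
        # Stand-in for the DenseNet feature extractor described in the paper.
        self.features = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU())
        self.senone_head = nn.Linear(hidden, n_senones)   # ASR targets
        self.domain_head = nn.Linear(hidden, n_domains)   # noise-condition classifier

    def forward(self, x, lambd=1.0):
        h = self.features(x)
        return self.senone_head(h), self.domain_head(GradReverse.apply(h, lambd))

model = DATModel()
y_senone, y_domain = model(torch.randn(8, 40))
# Senone CE plus domain CE; the GRL makes the domain term adversarial,
# so the shared features become invariant to the noise condition.
loss = nn.functional.cross_entropy(y_senone, torch.randint(2000, (8,))) \
     + nn.functional.cross_entropy(y_domain, torch.randint(4, (8,)))
loss.backward()
```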
Related papers
- Spiking Generative Adversarial Network with Attention Scoring Decoding [4.5727987473456055]
Spiking neural networks offer a closer approximation to brain-like processing.
We build a spiking generative adversarial network capable of handling complex images (a minimal spiking-neuron sketch follows this entry).
arXiv Detail & Related papers (2023-05-17T14:35:45Z)
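Spiking networks such as the GAN above are built from spiking neurons; the leaky integrate-and-fire (LIF) update is the standard building block. A minimal numpy sketch of generic LIF dynamics, not the paper's model; all constants are illustrative.

```python
# Leaky integrate-and-fire (LIF) update, the usual building block of
# spiking networks. All constants are illustrative.
import numpy as np

def lif_step(v, i_in, tau=20.0, v_th=1.0, dt=1.0):
    """One Euler step: leak toward zero, integrate input, spike and reset."""
    v = v + (dt / tau) * (-v + i_in)
    spikes = (v >= v_th).astype(float)     # binary spike-train output
    v = np.where(spikes > 0.0, 0.0, v)     # reset membrane where a spike fired
    return v, spikes

v = np.zeros(4)                            # membrane potentials of 4 neurons
for t in range(100):
    v, s = lif_step(v, i_in=np.random.rand(4) * 25.0)
```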
- Searching for the Essence of Adversarial Perturbations [73.96215665913797]
We show that adversarial perturbations contain human-recognizable information, which is the key conspirator responsible for a neural network's erroneous prediction.
This concept of human-recognizable information allows us to explain key features related to adversarial perturbations (a perturbation-generation sketch follows this entry).
arXiv Detail & Related papers (2022-05-30T18:04:57Z)
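To inspect what a perturbation contains, one first has to generate it; the fast gradient sign method (FGSM) is a standard choice. A hedged PyTorch sketch: the toy classifier, input, and epsilon are placeholders, and FGSM is only one of several attack methods such an analysis could use.

```python
# Generating an adversarial perturbation with FGSM so its content can be
# inspected directly. Model, input, label, and epsilon are placeholders.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))  # toy classifier
x = torch.rand(1, 1, 28, 28, requires_grad=True)
y = torch.tensor([3])

loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
delta = 0.1 * x.grad.sign()        # the adversarial perturbation itself
x_adv = (x + delta).clamp(0, 1)
# `delta` is the kind of object one would analyze for recognizable structure.
```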
- Stochastic resonance neurons in artificial neural networks [0.0]
We propose a new type of neural network that uses stochastic resonances as an inherent part of the architecture.
We show that such a neural network is more robust against the impact of noise (a toy demonstration follows this entry).
arXiv Detail & Related papers (2022-05-06T18:42:36Z)
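Stochastic resonance is the effect where moderate noise helps a nonlinear system detect a subthreshold signal. A toy numpy demonstration with a hard-threshold "neuron"; the signal, threshold, and noise levels are illustrative and do not reproduce the paper's architecture.

```python
# Stochastic resonance in a hard-threshold "neuron": a weak subthreshold
# signal is detected only when a moderate amount of noise is added.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
signal = 0.4 * np.sin(2 * np.pi * t)       # subthreshold: never reaches 1.0

def detection_score(noise_std):
    noisy = signal + rng.normal(0.0, noise_std, signal.shape)
    out = (noisy > 1.0).astype(float)      # threshold nonlinearity
    if out.std() == 0.0:                   # no threshold crossings at all
        return 0.0
    return np.corrcoef(out, signal)[0, 1]  # crude detection score

for std in (0.0, 0.5, 5.0):
    print(std, detection_score(std))       # score peaks at intermediate noise
```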
- Noise mitigation strategies in physical feedforward neural networks [0.0]
Physical neural networks are promising candidates for next-generation artificial intelligence hardware.
We introduce connectivity topologies, ghost neurons, and pooling as noise mitigation strategies (the pooling effect is sketched after this entry).
We demonstrate the effectiveness of the combined methods on a fully trained neural network classifying the MNIST handwritten digits.
arXiv Detail & Related papers (2022-04-20T13:51:46Z)
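The pooling strategy mentioned above works because averaging N independently noisy copies of a neuron's output shrinks the noise standard deviation by roughly 1/sqrt(N). A short numpy check of that scaling; the values are illustrative.

```python
# Pooling as noise mitigation: averaging N noisy copies of the same neuron
# output shrinks the noise standard deviation by roughly 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
true_activation = 0.7
for n_copies in (1, 4, 16, 64):
    noisy = true_activation + rng.normal(0.0, 0.1, size=(10000, n_copies))
    pooled = noisy.mean(axis=1)
    print(n_copies, pooled.std())          # ~0.1 / sqrt(n_copies)
```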
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Provable Regret Bounds for Deep Online Learning and Control [77.77295247296041]
We show that any loss function can be adapted to optimize the parameters of a neural network so that it competes with the best net in hindsight.
As an application of these results in the online setting, we obtain provable bounds for online control with deep neural network controllers (a sketch of the underlying online learning loop follows this entry).
arXiv Detail & Related papers (2021-10-15T02:13:48Z)
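Regret bounds of this kind are typically stated for an online gradient descent loop: at each round, predict, suffer a loss, then take a gradient step with a decaying learning rate. A minimal numpy sketch on a quadratic loss; the loss, data, and step size are illustrative, and the paper's analysis covers deep networks, which this sketch does not.

```python
# Online gradient descent, the canonical loop behind regret bounds:
# predict, suffer a loss, take a gradient step with a decaying rate.
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(5)
for t in range(1, 1001):
    x_t, y_t = rng.normal(size=5), rng.normal()
    grad = 2.0 * (w @ x_t - y_t) * x_t     # gradient of the round-t loss (w.x - y)^2
    w -= 0.1 / np.sqrt(t) * grad           # eta_t ~ 1/sqrt(t) gives O(sqrt(T)) regret
```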
- Building Compact and Robust Deep Neural Networks with Toeplitz Matrices [93.05076144491146]
This thesis focuses on the problem of training neural networks that are compact, easy to train, reliable, and robust to adversarial examples.
We leverage the properties of structured matrices from the Toeplitz family to build compact and secure neural networks (a parameterization sketch follows this entry).
arXiv Detail & Related papers (2021-09-02T13:58:12Z)
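A Toeplitz matrix is constant along each diagonal, so an n-by-n weight matrix needs only 2n - 1 free parameters; this is the source of the compactness the thesis exploits. A small numpy/scipy sketch of that parameterization; the sizes are illustrative.

```python
# A linear layer whose weight matrix is Toeplitz: n x n weights generated
# from only 2n - 1 parameters. Sizes are illustrative.
import numpy as np
from scipy.linalg import toeplitz

n = 4
params = np.random.randn(2 * n - 1)              # one value per diagonal
W = toeplitz(params[n - 1::-1], params[n - 1:])  # first column, first row
x = np.random.randn(n)
y = W @ x          # plain matvec here; FFT-based matvec runs in O(n log n)
```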
- Understanding and mitigating noise in trained deep neural networks [0.0]
We study the propagation of noise in deep neural networks comprising noisy nonlinear neurons in trained fully connected layers.
We find that noise accumulation is generally bounded, and adding network layers does not worsen the signal-to-noise ratio beyond a limit (a simulation sketch follows this entry).
We identify criteria that allow engineers to design noise-resilient novel neural network hardware.
arXiv Detail & Related papers (2021-03-12T17:16:26Z)
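The bounded-accumulation claim can be probed with a simple simulation: run the same input through a deep network twice, once clean and once with per-layer additive noise, and track the signal-to-noise ratio with depth. A numpy sketch of a generic tanh network, not the paper's exact setup; all sizes are illustrative.

```python
# Probing noise accumulation with depth: compare a clean forward pass with
# a noisy one and track the signal-to-noise ratio layer by layer.
import numpy as np

rng = np.random.default_rng(0)
width, depth, noise_std = 64, 20, 0.05
Ws = [rng.normal(0.0, 1.0 / np.sqrt(width), (width, width)) for _ in range(depth)]

clean = rng.normal(size=width)
noisy = clean.copy()
for layer, W in enumerate(Ws, start=1):
    clean = np.tanh(W @ clean)
    noisy = np.tanh(W @ noisy) + rng.normal(0.0, noise_std, width)
    snr = clean.std() / (noisy - clean).std()
    print(layer, round(float(snr), 2))     # SNR tends to level off, not collapse
```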
- ResiliNet: Failure-Resilient Inference in Distributed Neural Networks [56.255913459850674]
We introduce ResiliNet, a scheme for making inference in distributed neural networks resilient to physical node failures.
Its failout technique simulates physical node failure conditions during training using dropout, and is specifically designed to improve the resiliency of distributed neural networks (a minimal sketch follows this entry).
arXiv Detail & Related papers (2020-02-18T05:58:24Z)
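Failout can be pictured as dropout at the granularity of whole nodes: during training, an entire device's output is occasionally zeroed so downstream nodes learn to cope. A minimal PyTorch sketch; the two-node pipeline, sizes, and failure probability are illustrative, and ResiliNet itself adds machinery beyond this sketch.

```python
# Failout in miniature: zero out the entire output of one "physical node"
# at random during training (not individual units, as in ordinary dropout).
import torch
import torch.nn as nn

class FailoutPipeline(nn.Module):
    def __init__(self, dim=32, p_fail=0.1):
        super().__init__()
        self.node1 = nn.Linear(dim, dim)   # stands in for one physical device
        self.node2 = nn.Linear(dim, dim)   # downstream device
        self.head = nn.Linear(dim, 10)
        self.p_fail = p_fail

    def forward(self, x):
        h = torch.relu(self.node1(x))
        if self.training and torch.rand(()) < self.p_fail:
            h = torch.zeros_like(h)        # simulate node1 failing entirely
        h = torch.relu(self.node2(h))
        return self.head(h)

model = FailoutPipeline().train()
out = model(torch.randn(8, 32))
```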
- Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation [12.30062870698165]
We show that a noisy neural network has reduced learning capacity as a result of loss of mutual information between its input and output.
We propose using knowledge distillation combined with noise injection during training to achieve more noise-robust networks (a sketch follows this entry).
Our method yields models with up to two times greater noise tolerance than the previous best attempts.
arXiv Detail & Related papers (2020-01-14T18:59:48Z)
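Knowledge distillation with noise injection, as described above, trains a perturbed student against a clean teacher's soft targets. A minimal PyTorch sketch; the architectures, noise level, and temperature are illustrative placeholders, not the paper's configuration.

```python
# Knowledge distillation with noise injection: a clean teacher supervises a
# student whose hidden activations are perturbed during training.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10)).eval()
student = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))

x = torch.randn(8, 20)
with torch.no_grad():
    t_logits = teacher(x)

# Inject noise into the student's hidden activations to mimic analog hardware.
h = student[1](student[0](x))
h = h + 0.05 * torch.randn_like(h)
s_logits = student[2](h)

T = 4.0  # distillation temperature
kd_loss = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                   F.softmax(t_logits / T, dim=1),
                   reduction="batchmean") * T * T
kd_loss.backward()
```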
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.