On Filter Generalization for Music Bandwidth Extension Using Deep Neural
Networks
- URL: http://arxiv.org/abs/2011.07274v2
- Date: Wed, 6 Jan 2021 08:45:20 GMT
- Title: On Filter Generalization for Music Bandwidth Extension Using Deep Neural
Networks
- Authors: Serkan Sulun, Matthew E. P. Davies
- Abstract summary: We formulate the bandwidth extension problem using deep neural networks, where a band-limited signal is provided as input to the network.
Our main contribution centers on the impact of the choice of low pass filter when training and subsequently testing the network.
We propose a data augmentation strategy which utilizes multiple low pass filters during training and leads to improved generalization to unseen filtering conditions at test time.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we address a sub-topic of the broad domain of audio
enhancement, namely musical audio bandwidth extension. We formulate the
bandwidth extension problem using deep neural networks, where a band-limited
signal is provided as input to the network, with the goal of reconstructing a
full-bandwidth output. Our main contribution centers on the impact of the
choice of low pass filter when training and subsequently testing the network.
For two different state-of-the-art deep architectures, ResNet and U-Net, we
demonstrate that when the training and testing filters are matched,
improvements in signal-to-noise ratio (SNR) of up to 7 dB can be obtained.
However, when these filters differ, the improvement falls considerably and
under some training conditions results in a lower SNR than the band-limited
input. To circumvent this apparent overfitting to filter shape, we propose a
data augmentation strategy which utilizes multiple low pass filters during
training and leads to improved generalization to unseen filtering conditions at
test time.
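The band-limiting, SNR measurement, and multi-filter augmentation described above can be sketched as follows. The windowed-sinc filter families, the cutoff (half of Nyquist), and the test signal are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def random_lowpass(x, rng):
    """Band-limit x with a randomly chosen windowed-sinc low-pass FIR.

    The filter set (window type and tap count) is an illustrative stand-in
    for the paper's augmentation filters, which may differ.
    """
    cutoff = 0.25                                  # normalized cutoff: half of Nyquist
    numtaps = int(rng.choice([33, 65, 129]))
    n = np.arange(numtaps) - (numtaps - 1) / 2
    h = 2 * cutoff * np.sinc(2 * cutoff * n)       # ideal low-pass impulse response
    window = str(rng.choice(["hamming", "blackman", "hanning"]))
    h *= getattr(np, window)(numtaps)              # shape the transition band
    h /= h.sum()                                   # unity gain at DC
    return np.convolve(x, h, mode="same")

def snr_db(reference, estimate):
    """SNR in dB of an estimate against the full-bandwidth reference."""
    noise = reference - estimate
    return 10.0 * np.log10(np.sum(reference ** 2) / np.sum(noise ** 2))

rng = np.random.default_rng(0)
sr = 16000
t = np.arange(sr) / sr
# Low tone passes the filter; the 6 kHz tone lies above the cutoff.
x = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 6000 * t)
x_lp = random_lowpass(x, rng)   # band-limited network input; target is x
print(f"band-limited input SNR: {snr_db(x, x_lp):.1f} dB")
```

During training, drawing a fresh filter per example (as `random_lowpass` does via `rng`) is what exposes the network to multiple filter shapes.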
Related papers
- FilterNet: Harnessing Frequency Filters for Time Series Forecasting [34.83702192033196]
FilterNet is built upon our proposed learnable frequency filters to extract key informative temporal patterns by selectively passing or attenuating certain components of time series signals.
Equipped with these two filters, FilterNet can approximately surrogate the linear and attention mappings widely adopted in the time series literature.
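The forward pass of such a frequency-domain filter can be sketched in a few lines; the hand-set weights below stand in for FilterNet's learned parameters, and the actual architecture and training loop are not reproduced:

```python
import numpy as np

def frequency_filter(x, weights):
    """Scale each rFFT bin of x by a (normally learnable) weight,
    then transform back to the time domain. A forward-pass sketch only.
    """
    spectrum = np.fft.rfft(x)
    assert weights.shape == spectrum.shape
    return np.fft.irfft(spectrum * weights, n=len(x))

x = np.random.default_rng(1).standard_normal(128)
w = np.ones(128 // 2 + 1)
w[10:] = 0.0                       # attenuate everything above bin 10
y = frequency_filter(x, w)         # smoothed series, high frequencies removed
```

In the real model the weight vector would be a trainable parameter updated by gradient descent rather than set by hand.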
arXiv Detail & Related papers (2024-11-03T16:20:41Z)
- Music Enhancement with Deep Filters: A Technical Report for The ICASSP 2024 Cadenza Challenge [9.148696434829189]
In this challenge, we disentangle the deep filters from the original DeepfilterNet and incorporate them into our Spec-UNet-based network to further improve a hybrid Demucs (hdemucs) based remixing pipeline.
arXiv Detail & Related papers (2024-04-17T07:01:29Z)
- Filter-enhanced MLP is All You Need for Sequential Recommendation [89.0974365344997]
In online platforms, logged user behavior data inevitably contains noise.
We borrow the idea of filtering algorithms from signal processing, which attenuate noise in the frequency domain.
We propose FMLP-Rec, an all-MLP model with learnable filters for the sequential recommendation task.
arXiv Detail & Related papers (2022-02-28T05:49:35Z)
- The Pseudo Projection Operator: Applications of Deep Learning to Projection Based Filtering in Non-Trivial Frequency Regimes [5.632784019776093]
We introduce a PO-neural network hybrid model, the Pseudo Projection Operator (PPO), which leverages a neural network to perform frequency selection.
We compare the filtering capabilities of a PPO, PO, and denoising autoencoder (DAE) on the University of Rochester Multi-Modal Music Performance dataset.
In the majority of experiments, the PPO outperforms both the PO and DAE.
arXiv Detail & Related papers (2021-11-13T16:09:14Z)
- A Machine-Learning-Based Direction-of-Origin Filter for the Identification of Radio Frequency Interference in the Search for Technosignatures [0.0]
Convolutional neural networks (CNNs) offer a promising complement to existing filters.
We designed and trained a CNN that can determine whether or not a signal detected in one scan is also present in another scan.
This CNN-based DoO filter outperforms both a baseline 2D correlation model and existing DoO filters over a range of metrics.
arXiv Detail & Related papers (2021-07-28T20:22:39Z)
- Message Passing in Graph Convolution Networks via Adaptive Filter Banks [81.12823274576274]
We present a novel graph convolution operator, termed BankGCN.
It decomposes multi-channel signals on graphs into subspaces and handles particular information in each subspace with an adapted filter.
It achieves excellent performance in graph classification on a collection of benchmark graph datasets.
arXiv Detail & Related papers (2021-06-18T04:23:34Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Dependency Aware Filter Pruning [74.69495455411987]
Pruning a proportion of unimportant filters is an efficient way to mitigate the inference cost.
Previous work prunes filters according to their weight norms or the corresponding batch-norm scaling factors.
We propose a novel mechanism to dynamically control the sparsity-inducing regularization so as to achieve the desired sparsity.
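The norm-based baseline this summary refers to (ranking filters by their weight norms) can be sketched as follows; this illustrates only that baseline criterion, not the paper's dependency-aware, dynamically regularized mechanism:

```python
import numpy as np

def prune_by_l1_norm(conv_weight, keep_ratio):
    """Rank conv filters by L1 norm; return sorted indices of filters to keep.

    conv_weight: array of shape (out_channels, in_channels, kh, kw).
    A sketch of the common norm-based pruning baseline.
    """
    norms = np.abs(conv_weight).reshape(conv_weight.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * conv_weight.shape[0])))
    return np.sort(np.argsort(norms)[::-1][:n_keep])

rng = np.random.default_rng(0)
w = rng.standard_normal((8, 3, 3, 3))   # 8 filters of a toy conv layer
kept = prune_by_l1_norm(w, 0.5)         # keep the 4 strongest filters
```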
arXiv Detail & Related papers (2020-05-06T07:41:22Z)
- Filter Grafting for Deep Neural Networks: Reason, Method, and Cultivation [86.91324735966766]
Filters are the key components of modern convolutional neural networks (CNNs).
In this paper, we introduce filter grafting to achieve this goal.
We develop a novel criterion to measure the information of filters and an adaptive weighting strategy to balance the grafted information among networks.
arXiv Detail & Related papers (2020-04-26T08:36:26Z)
- Filter Grafting for Deep Neural Networks [71.39169475500324]
Filter grafting aims to improve the representation capability of Deep Neural Networks (DNNs).
We develop an entropy-based criterion to measure the information of filters and an adaptive weighting strategy for balancing the grafted information among networks.
For example, the grafted MobileNetV2 outperforms the non-grafted MobileNetV2 by about 7 percent on the CIFAR-100 dataset.
arXiv Detail & Related papers (2020-01-15T03:18:57Z)
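An entropy-style measure of how much information a filter carries, as both grafting papers describe, might look like the numpy sketch below; this histogram-based definition is a plausible stand-in, not the papers' exact criterion:

```python
import numpy as np

def filter_entropy(weights, bins=16):
    """Histogram entropy (in bits) of a filter's weight values.

    A simple proxy for filter information: a near-constant ("invalid")
    filter scores low, a filter with spread-out weights scores high.
    """
    hist, _ = np.histogram(weights, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                       # ignore empty bins (0 * log 0 = 0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
rich = rng.standard_normal(256)        # spread-out weights: high entropy
collapsed = np.zeros(256)              # constant filter: zero entropy
```

Under a grafting scheme, low-entropy filters would be the candidates to receive weights grafted from another network.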
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.