Generalized Fast Multichannel Nonnegative Matrix Factorization Based on
Gaussian Scale Mixtures for Blind Source Separation
- URL: http://arxiv.org/abs/2205.05330v1
- Date: Wed, 11 May 2022 08:09:39 GMT
- Title: Generalized Fast Multichannel Nonnegative Matrix Factorization Based on
Gaussian Scale Mixtures for Blind Source Separation
- Authors: Mathieu Fontaine (LTCI, RIKEN AIP), Kouhei Sekiguchi (RIKEN AIP),
Aditya Nugraha (RIKEN AIP), Yoshiaki Bando (AIST, RIKEN AIP), Kazuyoshi
Yoshii (RIKEN AIP)
- Abstract summary: This paper describes heavy-tailed extensions of a versatile blind source separation method called FastMNMF.
We develop an expectation-maximization algorithm that works even when the probability density function of the impulse variables has no analytical expression.
- Score: 3.141085922386211
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper describes heavy-tailed extensions of a state-of-the-art versatile
blind source separation method called fast multichannel nonnegative matrix
factorization (FastMNMF) from a unified point of view. The common way of
deriving such an extension is to replace the multivariate complex Gaussian
distribution in the likelihood function with its heavy-tailed generalization,
e.g., the multivariate complex Student's t and leptokurtic generalized Gaussian
distributions, and tailor-make the corresponding parameter optimization
algorithm. Using a wider class of heavy-tailed distributions called a Gaussian
scale mixture (GSM), i.e., a mixture of Gaussian distributions whose variances
are perturbed by positive random scalars called impulse variables, we propose
GSM-FastMNMF and develop an expectation-maximization algorithm that works even
when the probability density function of the impulse variables has no
analytical expression. We show that existing heavy-tailed FastMNMF extensions
are instances of GSM-FastMNMF and derive a new instance based on the
generalized hyperbolic distribution, which includes the normal-inverse
Gaussian, Student's t, and Gaussian distributions as special cases. Our
experiments show that the normal-inverse Gaussian FastMNMF outperforms the
state-of-the-art FastMNMF extensions and the ILRMA model in speech enhancement
and separation in
terms of the signal-to-distortion ratio.
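For concreteness, the GSM construction can be illustrated in a few lines: a sample is a zero-mean complex Gaussian vector whose covariance is scaled by a positive impulse variable, and the choice of impulse distribution selects the heavy-tailed member. The sketch below is illustrative only and is not the authors' implementation; the covariance matrix and impulse-distribution parameters are arbitrary placeholders, and in GSM-FastMNMF the covariance is instead structured by the FastMNMF low-rank spatial model.

```python
# Minimal sketch of a complex Gaussian scale mixture (GSM):
#   x = sqrt(phi) * g,  g ~ circular complex Gaussian,  phi > 0 an impulse variable.
# Different impulse priors give different heavy-tailed members:
#   inverse-gamma(nu/2, nu/2) -> complex Student's t, inverse Gaussian -> NIG,
#   phi fixed to 1            -> plain Gaussian.
import numpy as np

rng = np.random.default_rng(0)

def sample_gsm(n, cov, impulse_sampler):
    """Draw n zero-mean complex GSM samples with covariance scale `cov`."""
    d = cov.shape[0]
    L = np.linalg.cholesky(cov)
    # circularly symmetric complex Gaussian with identity covariance
    g = (rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))) / np.sqrt(2)
    g = g @ L.T                                  # correlate: covariance becomes `cov`
    phi = impulse_sampler(n)                     # positive impulse variables
    return np.sqrt(phi)[:, None] * g             # perturb each sample's variance by phi

cov = np.array([[2.0, 0.5],
                [0.5, 1.0]])

# Student's t member: phi ~ InverseGamma(nu/2, nu/2)
nu = 4.0
x_t = sample_gsm(10_000, cov, lambda n: 1.0 / rng.gamma(nu / 2.0, 2.0 / nu, n))

# Normal-inverse Gaussian member: phi ~ InverseGaussian (illustrative parameters)
x_nig = sample_gsm(10_000, cov, lambda n: rng.wald(1.0, 1.0, n))

# Gaussian member: degenerate impulse variable phi == 1
x_gauss = sample_gsm(10_000, cov, lambda n: np.ones(n))
```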
Related papers
- Convex Parameter Estimation of Perturbed Multivariate Generalized
Gaussian Distributions [18.95928707619676]
We propose a convex formulation with well-established properties for MGGD parameters.
The proposed framework is flexible as it combines a variety of regularizations for the precision matrix, the mean and perturbations.
Experiments show a more accurate precision and covariance matrix estimation with similar performance for the mean vector parameter.
arXiv Detail & Related papers (2023-12-12T18:08:04Z)
- On the Computation of the Gaussian Rate-Distortion-Perception Function [10.564071872770146]
We study the computation of the rate-distortion-perception function (RDPF) for a multivariate Gaussian source under mean squared error (MSE) distortion.
We provide the associated algorithmic realization, as well as the convergence and the rate of convergence characterization.
We corroborate our results with numerical simulations and draw connections to existing results.
arXiv Detail & Related papers (2023-11-15T18:34:03Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models [15.605031496980775]
We present a new class of approximate message passing (AMP) algorithms and give a state evolution recursion.
Our results show that this complexity gain comes at little to no cost in the performance of the algorithm.
arXiv Detail & Related papers (2022-12-03T08:10:35Z)
- Generalizing Gaussian Smoothing for Random Search [23.381986209234164]
Gaussian smoothing (GS) is a derivative-free optimization algorithm that estimates the gradient of an objective using Gaussian perturbations of the current iterate.
We propose to choose the perturbation distribution so as to minimize the mean squared error (MSE) of the gradient estimator, yielding provably smaller MSE than standard Gaussian perturbations (a minimal sketch of the baseline GS estimator is given after this list).
arXiv Detail & Related papers (2022-11-27T04:42:05Z)
- Stochastic Mirror Descent in Average Ensemble Models [38.38572705720122]
Stochastic mirror descent (SMD) is a general class of training algorithms, which includes the celebrated stochastic gradient descent (SGD) as a special case.
In this paper we explore the performance of SMD on mean-field ensemble models.
arXiv Detail & Related papers (2022-10-27T11:04:00Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- GANs with Variational Entropy Regularizers: Applications in Mitigating the Mode-Collapse Issue [95.23775347605923]
Building on the success of deep learning, Generative Adversarial Networks (GANs) provide a modern approach to learn a probability distribution from observed samples.
GANs often suffer from the mode collapse issue where the generator fails to capture all existing modes of the input distribution.
We take an information-theoretic approach and maximize a variational lower bound on the entropy of the generated samples to increase their diversity.
arXiv Detail & Related papers (2020-09-24T19:34:37Z)
- Training Deep Energy-Based Models with f-Divergence Minimization [113.97274898282343]
Deep energy-based models (EBMs) are very flexible in distribution parametrization but computationally challenging.
We propose a general variational framework termed f-EBM to train EBMs using any desired f-divergence.
Experimental results demonstrate the superiority of f-EBM over contrastive divergence, as well as the benefits of training EBMs using f-divergences other than KL.
arXiv Detail & Related papers (2020-03-06T23:11:13Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient evaluation of likelihoods and efficient inversion for sample generation.
Thanks to their guaranteed expressivity, these models can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
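As referenced in the "Generalizing Gaussian Smoothing for Random Search" entry above, the baseline GS gradient estimator can be sketched in a few lines. The code below is a minimal, illustrative sketch of the standard antithetic estimator, not that paper's generalized perturbation scheme; sigma, the sample count, and the step size are placeholder values.

```python
# Standard antithetic Gaussian-smoothing (GS) gradient estimator.
import numpy as np

rng = np.random.default_rng(0)

def gs_gradient(f, x, sigma=0.1, n_samples=64):
    """Estimate grad f(x) from function evaluations only."""
    u = rng.standard_normal((n_samples, x.shape[0]))   # Gaussian perturbation directions
    forward = np.array([f(x + sigma * ui) for ui in u])
    backward = np.array([f(x - sigma * ui) for ui in u])
    # antithetic differences reduce the variance of the estimator
    return ((forward - backward)[:, None] * u).mean(axis=0) / (2.0 * sigma)

# toy usage: derivative-free descent on a quadratic
f = lambda z: float(np.sum(z ** 2))
x = np.ones(5)
for _ in range(200):
    x = x - 0.1 * gs_gradient(f, x)
print(np.round(x, 3))   # close to the minimizer at the origin
```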