Probabilistic Modelling of Signal Mixtures with Differentiable
Dictionaries
- URL: http://arxiv.org/abs/2211.15439v1
- Date: Mon, 28 Nov 2022 15:27:53 GMT
- Title: Probabilistic Modelling of Signal Mixtures with Differentiable
Dictionaries
- Authors: Lukáš Samuel Marták, Rainer Kelz, Gerhard Widmer
- Abstract summary: We introduce a novel way to incorporate prior information into (semi-) supervised non-negative matrix factorization.
It enables principled modelling of mixtures where non-linear sources are linearly mixed.
- Score: 8.680081568962997
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a novel way to incorporate prior information into (semi-)
supervised non-negative matrix factorization, which we call differentiable
dictionary search. It enables general, highly flexible and principled modelling
of mixtures where non-linear sources are linearly mixed. We study its behavior
on an audio decomposition task, and conduct an extensive, highly controlled
study of its modelling capabilities.
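To make the setting concrete, the following minimal sketch shows the basic (semi-)supervised NMF scenario the abstract builds on: a fixed non-negative dictionary W encodes prior knowledge about the sources, and only the activations H are estimated via standard multiplicative updates. This is an illustration of the underlying linear-mixing decomposition, not the paper's differentiable dictionary search itself; all names and values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "dictionary" of two non-negative source templates (columns of W),
# e.g. spectral templates in an audio decomposition setting.
W = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])

# Build a mixture V from known activations so recovery can be checked.
H_true = np.array([[2.0, 0.5],
                   [0.3, 1.5]])
V = W @ H_true

# Supervised NMF: W is fixed prior information; only H is estimated,
# using the standard multiplicative update for the Frobenius objective.
H = rng.random((2, 2)) + 0.1
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)

print("max reconstruction error:", np.abs(W @ H - V).max())
```

Differentiable dictionary search replaces the fixed template matrix with a learned, differentiable model of each source, while keeping the linear mixing structure of the decomposition.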
Related papers
- A unified framework for learning with nonlinear model classes from
arbitrary linear samples [0.7366405857677226]
This work considers the fundamental problem of learning an unknown object from training data using a given model class.
We introduce a unified framework that allows for objects in arbitrary Hilbert spaces, general types of (random) linear measurements as training data and general types of nonlinear model classes.
We present examples such as matrix sketching by random sampling, compressed sensing with isotropic vectors, active learning in regression and compressed sensing with generative models.
arXiv Detail & Related papers (2023-11-25T00:43:22Z)
- Assessing the overall and partial causal well-specification of nonlinear
additive noise models [4.13592995550836]
We aim to identify predictor variables for which we can infer the causal effect even in cases of such misspecifications.
We propose an algorithm for finite sample data, discuss its properties, and illustrate its performance on simulated and real data.
arXiv Detail & Related papers (2023-10-25T09:44:16Z)
- Tensor Decompositions Meet Control Theory: Learning General Mixtures of
Linear Dynamical Systems [19.47235707806519]
We give a new approach to learning mixtures of linear dynamical systems based on tensor decompositions.
Our algorithm succeeds without strong separation conditions on the components, and can be used to compete with the Bayes optimal clustering of the trajectories.
arXiv Detail & Related papers (2023-07-13T03:00:01Z)
- Differentiable Dictionary Search: Integrating Linear Mixing with Deep
Non-Linear Modelling for Audio Source Separation [8.680081568962997]
This paper describes several improvements to a new method for signal decomposition that we recently formulated under the name Differentiable Dictionary Search (DDS).
The fundamental idea is to exploit a class of powerful deep invertible density estimators called normalizing flows, to model the dictionary in a linear decomposition method such as NMF.
As the initial formulation was a proof of concept with some practical limitations, we will present several steps towards making it scalable.
arXiv Detail & Related papers (2022-11-28T16:37:02Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Discovery of Nonlinear Dynamical Systems using a Runge-Kutta Inspired
Dictionary-based Sparse Regression Approach [9.36739413306697]
We blend machine learning and dictionary-based learning with numerical analysis tools to discover governing differential equations.
We obtain interpretable and parsimonious models that tend to generalize better beyond the sampling regime.
We discuss its extension to governing equations, containing rational nonlinearities that typically appear in biological networks.
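The dictionary-based sparse regression idea behind such methods can be sketched in a few lines: build a dictionary of candidate terms, then use sequentially thresholded least squares to keep only the few terms that explain the observed derivatives. This is a generic SINDy-style illustration, not the paper's Runge-Kutta-inspired scheme; the dynamics and threshold below are hypothetical choices.

```python
import numpy as np

# Trajectory of the known system dx/dt = -2x, sampled on a grid.
t = np.linspace(0.0, 2.0, 201)
x = 3.0 * np.exp(-2.0 * t)
dx = np.gradient(x, t, edge_order=2)  # numerical derivative estimate

# Dictionary of candidate terms: [1, x, x^2, x^3].
Theta = np.column_stack([np.ones_like(x), x, x**2, x**3])

# Sequentially thresholded least squares: fit, zero small coefficients, refit.
xi = np.linalg.lstsq(Theta, dx, rcond=None)[0]
for _ in range(10):
    small = np.abs(xi) < 0.2
    xi[small] = 0.0
    if (~small).any():
        xi[~small] = np.linalg.lstsq(Theta[:, ~small], dx, rcond=None)[0]

# Should recover a single active term, dx/dt ~ -2 x.
print("recovered coefficients:", np.round(xi, 3))
```

Extending the dictionary with rational candidate terms is one way to reach the rational nonlinearities mentioned above, at the cost of a harder regression problem.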
arXiv Detail & Related papers (2021-05-11T08:46:51Z)
- Hessian Eigenspectra of More Realistic Nonlinear Models [73.31363313577941]
We provide a precise characterization of the Hessian eigenspectra for a broad family of nonlinear models.
Our analysis takes a step forward to identify the origin of many striking features observed in more complex machine learning models.
arXiv Detail & Related papers (2021-03-02T06:59:52Z)
- Nonlinear Independent Component Analysis for Continuous-Time Signals [85.59763606620938]
We study the classical problem of recovering a multidimensional source process from observations of mixtures of this process.
We show that this recovery is possible for many popular models of processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function.
arXiv Detail & Related papers (2021-02-04T20:28:44Z)
- Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [55.28436972267793]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z)
- Nonlinear ISA with Auxiliary Variables for Learning Speech
Representations [51.9516685516144]
We introduce a theoretical framework for nonlinear Independent Subspace Analysis (ISA) in the presence of auxiliary variables.
We propose an algorithm that learns unsupervised speech representations whose subspaces are independent.
arXiv Detail & Related papers (2020-07-25T14:53:09Z)
- Accounting for Unobserved Confounding in Domain Generalization [107.0464488046289]
This paper investigates the problem of learning robust, generalizable prediction models from a combination of datasets.
Part of the challenge of learning robust models lies in the influence of unobserved confounders.
We demonstrate the empirical performance of our approach on healthcare data from different modalities.
arXiv Detail & Related papers (2020-07-21T08:18:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.