Nonlinear Independent Component Analysis for Continuous-Time Signals
- URL: http://arxiv.org/abs/2102.02876v1
- Date: Thu, 4 Feb 2021 20:28:44 GMT
- Title: Nonlinear Independent Component Analysis for Continuous-Time Signals
- Authors: Harald Oberhauser and Alexander Schell
- Abstract summary: We study the classical problem of recovering a multidimensional source process from observations of mixtures of this process.
We show that this recovery is possible for many popular models of processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function.
- Score: 85.59763606620938
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the classical problem of recovering a multidimensional source
process from observations of nonlinear mixtures of this process. Assuming
statistical independence of the coordinate processes of the source, we show
that this recovery is possible for many popular models of stochastic processes
(up to order and monotone scaling of their coordinates) if the mixture is given
by a sufficiently differentiable, invertible function. Key to our approach is
the combination of tools from stochastic analysis and recent contrastive
learning approaches to nonlinear ICA. This yields a scalable method with widely
applicable theoretical guarantees for which our experiments indicate good
performance.
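The contrastive-learning idea mentioned in the abstract can be illustrated with a minimal sketch (this is not the authors' code; the sources, the mixture, and the single handcrafted feature are all illustrative assumptions). Two independent time series are passed through a smooth invertible mixture, and a classifier is trained to distinguish genuine consecutive observation pairs from time-shuffled ones; exploiting the temporal dependence of the sources is what makes this discrimination possible.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Two statistically independent source processes with different dynamics
# (illustrative choices, not the models treated in the paper).
s1 = np.cumsum(rng.normal(size=n))    # Brownian-motion-like source
s2 = np.cumsum(rng.laplace(size=n))   # heavier-tailed source
S = np.stack([s1, s2], axis=1)

# A smooth, invertible mixture f: R^2 -> R^2; its Jacobian determinant
# is bounded below by 0.75, so f is indeed invertible.
X = np.stack([S[:, 0] + 0.5 * np.tanh(S[:, 1]),
              S[:, 1] + 0.5 * np.tanh(S[:, 0])], axis=1)

# Contrastive labels: consecutive observation pairs (x_t, x_{t+1}) are
# "real" (1); pairs whose second element is time-shuffled are "fake" (0).
real = np.hstack([X[:-1], X[1:]])
fake = np.hstack([X[:-1], X[1:][rng.permutation(n - 1)]])
pairs = np.vstack([real, fake])
labels = np.hstack([np.ones(n - 1), np.zeros(n - 1)])

# A single handcrafted feature (squared increment size) already separates
# the classes here; in practice a learned nonlinear feature map is used.
feat = np.sum((pairs[:, :2] - pairs[:, 2:]) ** 2, axis=1)
feat = (feat - feat.mean()) / feat.std()

# Plain logistic regression trained by gradient descent.
w, b = 0.0, 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-np.clip(w * feat + b, -30, 30)))
    w -= 0.1 * np.mean((p - labels) * feat)
    b -= 0.1 * np.mean(p - labels)

acc = np.mean((p > 0.5) == labels)  # clearly above chance (0.5)
```

A classifier that beats chance on this task must be sensitive to the sources' temporal structure, which is the lever the contrastive approach uses to undo the unknown mixture.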
Related papers
- A Stability Principle for Learning under Non-Stationarity [1.1510009152620668]
We develop a versatile framework for statistical learning in non-stationary environments.
At the heart of our analysis lie two novel components: a measure of similarity between functions and a segmentation technique for dividing the non-stationary data sequence into quasi-stationary pieces.
arXiv Detail & Related papers (2023-10-27T17:53:53Z) - Non-Parametric Learning of Stochastic Differential Equations with Non-asymptotic Fast Rates of Convergence [65.63201894457404]
We propose a novel non-parametric learning paradigm for the identification of drift and diffusion coefficients of non-linear stochastic differential equations.
The key idea is to fit an RKHS-based approximation of the corresponding Fokker-Planck equation to such observations.
arXiv Detail & Related papers (2023-05-24T20:43:47Z) - Nonparametric Independent Component Analysis for the Sources with Mixed
Spectra [0.06445605125467573]
Most existing ICA procedures assume independent sampling.
Second-order-statistics-based source separation methods have been developed from parametric time series models for mixtures of autocorrelated sources.
We propose a new ICA method by estimating spectral density functions and line spectra of the source signals using cubic splines and indicator functions.
arXiv Detail & Related papers (2022-12-13T02:13:14Z) - Moment Estimation for Nonparametric Mixture Models Through Implicit
Tensor Decomposition [7.139680863764187]
We present an alternating least squares type numerical optimization scheme to estimate conditionally-independent mixture models in $\mathbb{R}^n$.
We compute the cumulative distribution functions, higher moments and other statistics of the component distributions through linear solves.
Numerical experiments demonstrate the competitive performance of the algorithm, and its applicability to many models and applications.
arXiv Detail & Related papers (2022-10-25T23:31:33Z) - Uncertainty Disentanglement with Non-stationary Heteroscedastic Gaussian
Processes for Active Learning [10.757942829334057]
We propose a Non-stationary Heteroscedastic Gaussian process model which can be learned with gradient-based techniques.
We demonstrate the interpretability of the proposed model by separating the overall uncertainty into aleatoric (irreducible) and epistemic (model) uncertainty.
arXiv Detail & Related papers (2022-10-20T02:18:19Z) - On the Identifiability of Nonlinear ICA: Sparsity and Beyond [20.644375143901488]
How to make the nonlinear ICA model identifiable up to certain trivial indeterminacies is a long-standing problem in unsupervised learning.
Recent breakthroughs reformulate the standard independence assumption of sources as conditional independence given some auxiliary variables.
We show that under specific instantiations of such constraints, the independent latent sources can be identified from their nonlinear mixtures up to a permutation.
arXiv Detail & Related papers (2022-06-15T18:24:22Z) - Discovering Latent Causal Variables via Mechanism Sparsity: A New
Principle for Nonlinear ICA [81.4991350761909]
Independent component analysis (ICA) refers to an ensemble of methods which formalize the goal of recovering latent causal variables and provide estimation procedures for practical application.
We show that the latent variables can be recovered up to a permutation if one regularizes the latent mechanisms to be sparse.
arXiv Detail & Related papers (2021-07-21T14:22:14Z) - A Discrete Variational Derivation of Accelerated Methods in Optimization [68.8204255655161]
We introduce a discrete variational framework which allows us to derive different methods for optimization.
We derive two families of optimization methods in one-to-one correspondence.
The preservation of symplecticity of autonomous systems occurs here solely on the fibers.
arXiv Detail & Related papers (2021-06-04T20:21:53Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but its role in this success is still unclear.
We show that multiplicative noise commonly arises in the parameter updates of stochastic optimization due to minibatch variance.
A detailed analysis describes how key factors, including step size and data, lead to similar heavy-tailed behavior on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - Learning Stochastic Behaviour from Aggregate Data [52.012857267317784]
Learning nonlinear dynamics from aggregate data is a challenging problem because the full trajectory of each individual is not available.
We propose a novel method using the weak form of the Fokker-Planck equation (FPE) to describe the density evolution of data in a sampled form.
In such a sample-based framework we are able to learn the nonlinear dynamics from aggregate data without explicitly solving the partial differential equation (PDE) FPE.
arXiv Detail & Related papers (2020-02-10T03:20:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.