WICA: nonlinear weighted ICA
- URL: http://arxiv.org/abs/2001.04147v2
- Date: Wed, 9 Dec 2020 21:37:54 GMT
- Title: WICA: nonlinear weighted ICA
- Authors: Andrzej Bedychaj, Przemys{\l}aw Spurek, Aleksandra Nowak, Jacek Tabor
- Abstract summary: Independent Component Analysis (ICA) aims to find a coordinate system in which the components of the data are independent.
We construct a new nonlinear ICA model, called WICA, which obtains better and more stable results than other algorithms.
- Score: 72.02008296553318
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Independent Component Analysis (ICA) aims to find a coordinate system in
which the components of the data are independent. In this paper we construct a
new nonlinear ICA model, called WICA, which obtains better and more stable
results than other algorithms. A crucial tool is given by a new efficient
method of verifying nonlinear dependence with the use of computation of
correlation coefficients for normally weighted data. In addition, the authors
propose a new baseline nonlinear mixing to perform comparable experiments, and
a reliable measure which allows fair comparison of nonlinear models. Our code
for WICA is available on Github https://github.com/gmum/wica.
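The dependence test described in the abstract can be sketched as follows: weight the data by a Gaussian centered at a randomly chosen point, compute the weighted correlation matrix, and treat large off-diagonal entries as evidence of (possibly nonlinear) dependence. This is a minimal illustration under assumed choices; the function names, the `scale` parameter, and the number of centers are illustrative and not the paper's exact procedure.

```python
import numpy as np

def weighted_correlation(x, w):
    """Weighted Pearson correlation matrix for samples x (n, d) and weights w (n,)."""
    w = w / w.sum()
    mean = w @ x                      # weighted mean, shape (d,)
    xc = x - mean
    cov = (xc * w[:, None]).T @ xc    # weighted covariance, shape (d, d)
    std = np.sqrt(np.diag(cov))
    return cov / np.outer(std, std)

def weighted_dependence(x, n_centers=10, scale=1.0, seed=0):
    """Average absolute off-diagonal weighted correlation over random
    Gaussian weightings; stays near zero for independent components."""
    rng = np.random.default_rng(seed)
    n, d = x.shape
    total = 0.0
    for _ in range(n_centers):
        center = x[rng.integers(n)]   # weight the data around a random sample
        w = np.exp(-np.sum((x - center) ** 2, axis=1) / (2 * scale ** 2))
        total += np.abs(weighted_correlation(x, w) - np.eye(d)).sum()
    return total / n_centers
```

For centered Gaussian data, a dependence such as y = x^2 has zero global correlation with x, so an unweighted test misses it; the locally weighted correlations, however, are far from zero, which is the intuition the abstract appeals to.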
Related papers
- Ordering-Based Causal Discovery for Linear and Nonlinear Relations [7.920599542957298]
CaPS is an ordering-based causal discovery algorithm that effectively handles linear and nonlinear relations.
Results obtained from real-world data also support the competitiveness of CaPS.
arXiv Detail & Related papers (2024-10-08T10:33:18Z) - Identifiable Feature Learning for Spatial Data with Nonlinear ICA [18.480534062833673]
We introduce a new nonlinear ICA framework that employs latent components which apply naturally to data with higher-dimensional dependency structures.
In particular, we develop a new learning algorithm that extends variational methods to handle the combination of a deep neural network mixing function with the TP prior, while retaining computational efficiency.
arXiv Detail & Related papers (2023-11-28T15:00:11Z) - Efficient Interpretable Nonlinear Modeling for Multiple Time Series [5.448070998907116]
This paper proposes an efficient nonlinear modeling approach for multiple time series.
It incorporates nonlinear interactions among different time-series variables.
Experimental results show that the proposed algorithm improves the identification of the support of the VAR coefficients in a parsimonious manner.
arXiv Detail & Related papers (2023-09-29T11:42:59Z) - Learning new physics efficiently with nonparametric methods [11.970219534238444]
We present a machine learning approach for model-independent new physics searches.
The corresponding algorithm is powered by recent large-scale implementations of kernel methods.
We show that our approach has dramatic advantages compared to neural network implementations in terms of training times and computational resources.
arXiv Detail & Related papers (2022-04-05T16:17:59Z) - Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z) - A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z) - Sparse PCA via $l_{2,p}$-Norm Regularization for Unsupervised Feature Selection [138.97647716793333]
We propose a simple and efficient unsupervised feature selection method, by combining reconstruction error with $l_{2,p}$-norm regularization.
We present an efficient optimization algorithm to solve the proposed unsupervised model, and analyse the convergence and computational complexity of the algorithm theoretically.
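The $l_{2,p}$-norm regularizer in this entry has a standard row-wise definition: the sum over rows of the Euclidean norm raised to the $p$-th power, which promotes row sparsity (and hence feature selection) for $p \le 1$. A small sketch of that definition, with the value of $p$ purely illustrative:

```python
import numpy as np

def l2p_norm(W, p=0.5):
    """Row-wise l_{2,p} regularizer: sum_i ||W_i||_2^p.
    Small p (< 1) drives whole rows of W toward zero, selecting features."""
    return np.sum(np.linalg.norm(W, axis=1) ** p)
```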
arXiv Detail & Related papers (2020-12-29T04:08:38Z) - LQF: Linear Quadratic Fine-Tuning [114.3840147070712]
We present the first method for linearizing a pre-trained model that achieves comparable performance to non-linear fine-tuning.
LQF consists of simple modifications to the architecture, loss function and optimization typically used for classification.
arXiv Detail & Related papers (2020-12-21T06:40:20Z) - A polynomial-time algorithm for learning nonparametric causal graphs [18.739085486953698]
The analysis is model-free and does not assume linearity, additivity, independent noise, or faithfulness.
We impose a condition on the residual variances that is closely related to previous work on linear models with equal variances.
arXiv Detail & Related papers (2020-06-22T02:21:53Z) - Learning nonlinear dynamical systems from a single trajectory [102.60042167341956]
We introduce algorithms for learning nonlinear dynamical systems of the form $x_{t+1}=\sigma(\Theta^{*}x_t)+\varepsilon_t$.
We give an algorithm that recovers the weight matrix $\Theta^{*}$ from a single trajectory with optimal sample complexity and linear running time.
arXiv Detail & Related papers (2020-04-30T10:42:48Z)
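The model in the last entry above can be simulated and fitted with a plain baseline: invert the link function and solve ordinary least squares on the single trajectory. This sketch assumes $\sigma = \tanh$ and an illustrative noise level; it is a simple baseline, not the paper's algorithm with its sample-complexity guarantees.

```python
import numpy as np

def simulate(theta, T, noise=0.1, seed=0):
    """Generate one trajectory of x_{t+1} = tanh(theta @ x_t) + eps_t."""
    rng = np.random.default_rng(seed)
    x = np.zeros(theta.shape[0])
    xs = [x]
    for _ in range(T):
        x = np.tanh(theta @ x) + noise * rng.normal(size=x.shape)
        xs.append(x)
    return np.array(xs)

def recover_theta(xs):
    """Estimate theta by inverting the link and solving least squares:
    arctanh(x_{t+1}) is approximately theta @ x_t for small noise."""
    x, y = xs[:-1], xs[1:]
    targets = np.arctanh(np.clip(y, -0.999, 0.999))
    theta_t, *_ = np.linalg.lstsq(x, targets, rcond=None)
    return theta_t.T
```

With a stable weight matrix (spectral radius below one) and moderate noise, a few thousand steps of a single trajectory recover the entries of theta to within a few hundredths.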
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.