Projection pursuit based on Gaussian mixtures and evolutionary algorithms
- URL: http://arxiv.org/abs/1912.12049v1
- Date: Fri, 27 Dec 2019 10:25:41 GMT
- Title: Projection pursuit based on Gaussian mixtures and evolutionary algorithms
- Authors: Luca Scrucca and Alessio Serafini
- Abstract summary: We propose a projection pursuit (PP) algorithm based on Gaussian mixture models (GMMs).
We show that this semi-parametric approach to PP is flexible and allows highly informative structures to be detected.
The performance of the proposed approach is shown on both artificial and real datasets.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a projection pursuit (PP) algorithm based on Gaussian mixture
models (GMMs). The negentropy obtained from a multivariate density estimated by
GMMs is adopted as the PP index to be maximised. For a fixed dimension of the
projection subspace, the GMM-based density estimate is projected onto that
subspace, where an approximation of the negentropy for Gaussian mixtures is
computed. Genetic Algorithms (GAs) are then used to find the optimal orthogonal
projection basis by maximising this approximation. We show that this
semi-parametric approach to PP is flexible and allows highly informative
structures to be detected by projecting multivariate datasets onto a subspace
where the data can be feasibly visualised. The performance of the proposed
approach is shown on both artificial and real datasets.
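To make the procedure concrete, below is a minimal Python sketch of the loop described in the abstract. It is an illustration under stated assumptions, not the authors' implementation: the negentropy is approximated here by Monte Carlo entropy estimation on a GMM refitted to the projected data (the paper instead projects the full-dimensional GMM density and computes a dedicated approximation), and the genetic algorithm is reduced to a mutate-and-select loop without crossover. All function names are hypothetical; NumPy, SciPy and scikit-learn are assumed.

```python
# Illustrative sketch of GMM-based projection pursuit (hypothetical names).
import numpy as np
from scipy.linalg import qr
from sklearn.mixture import GaussianMixture

def negentropy_mc(Z, n_components=3, n_mc=5000, seed=0):
    """Monte Carlo negentropy J(Z) = H(N(mu, Sigma)) - H(Z), where the
    entropy H(Z) is estimated from a GMM fitted to the projected data Z."""
    d = Z.shape[1]
    gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(Z)
    samples, _ = gmm.sample(n_mc)
    h_z = -gmm.score_samples(samples).mean()      # MC estimate of H(Z)
    cov = np.atleast_2d(np.cov(Z, rowvar=False))
    h_gauss = 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(cov))
    return h_gauss - h_z                          # negentropy is >= 0

def pp_index(X, B):
    """PP index: negentropy of the data projected onto the basis B."""
    return negentropy_mc(X @ B)

def random_orthonormal(p, d, rng):
    """Random p x d orthonormal basis via the QR decomposition."""
    Q, _ = qr(rng.standard_normal((p, d)), mode='economic')
    return Q

def ga_projection_pursuit(X, d=2, pop_size=20, n_gen=30, sigma=0.1, seed=1):
    """Toy GA: keep the fittest half, refill by mutating the survivors."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    pop = [random_orthonormal(p, d, rng) for _ in range(pop_size)]
    for _ in range(n_gen):
        order = np.argsort([pp_index(X, B) for B in pop])[::-1]
        elite = [pop[i] for i in order[: pop_size // 2]]
        children = []
        for B in elite:
            # Gaussian mutation followed by re-orthonormalisation
            Q, _ = qr(B + sigma * rng.standard_normal(B.shape), mode='economic')
            children.append(Q)
        pop = elite + children
    return max(pop, key=lambda B: pp_index(X, B))

# Example: recover a 2-D view separating two hidden groups in 5-D data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(3, 1, (100, 5))])
B = ga_projection_pursuit(X, d=2)
Z = X @ B  # projected data, ready for plotting
```

Re-orthonormalising after each mutation keeps every candidate basis orthogonal, mirroring the orthogonality constraint that the paper enforces during the GA search.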
Related papers
- Graph Laplacian-based Bayesian Multi-fidelity Modeling [1.383698759122035]
A graph Laplacian constructed from the low-fidelity data is used to define a multivariate Gaussian prior density.
Few high-fidelity data points are used to construct a conjugate likelihood term.
The results demonstrate that by utilizing a small fraction of high-fidelity data, the multi-fidelity approach can significantly improve the accuracy of a large collection of low-fidelity data points.
arXiv Detail & Related papers (2024-09-12T16:51:55Z) - Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
Building on this, we obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients.
arXiv Detail & Related papers (2024-04-16T20:01:10Z) - GPS-Gaussian: Generalizable Pixel-wise 3D Gaussian Splatting for Real-time Human Novel View Synthesis [70.24111297192057]
We present a new approach, termed GPS-Gaussian, for synthesizing novel views of a character in real time.
The proposed method enables 2K-resolution rendering under a sparse-view camera setting.
arXiv Detail & Related papers (2023-12-04T18:59:55Z) - Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z) - Sparse high-dimensional linear regression with a partitioned empirical
Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are required, thanks to plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z) - A similarity-based Bayesian mixture-of-experts model [0.5156484100374058]
We present a new non-parametric mixture-of-experts model for multivariate regression problems.
Using a conditionally specified model, predictions for out-of-sample inputs are based on similarities to each observed data point.
Posterior inference is performed on the parameters of the mixture as well as the distance metric.
arXiv Detail & Related papers (2020-12-03T18:08:30Z) - Distributed Variational Bayesian Algorithms Over Sensor Networks [6.572330981878818]
We propose two novel distributed VB algorithms for general Bayesian inference problems.
The proposed algorithms perform almost as well as the corresponding centralized VB algorithm, which relies on all data being available at a fusion center.
arXiv Detail & Related papers (2020-11-27T08:12:18Z) - Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z) - Bayesian learning of orthogonal embeddings for multi-fidelity Gaussian
Processes [3.564709604457361]
"Projection" mapping consists of an orthonormal matrix that is considered a priori unknown and needs to be inferred jointly with the GP parameters.
We extend the proposed framework to multi-fidelity models using GPs including the scenarios of training multiple outputs together.
The benefits of our proposed framework, are illustrated on the computationally challenging three-dimensional aerodynamic optimization of a last-stage blade for an industrial gas turbine.
arXiv Detail & Related papers (2020-08-05T22:28:53Z) - Two-Dimensional Semi-Nonnegative Matrix Factorization for Clustering [50.43424130281065]
We propose a new Semi-Nonnegative Matrix Factorization method for 2-dimensional (2D) data, named TS-NMF.
It overcomes the drawback of existing methods, which seriously damage the spatial information of the data by converting 2D data to vectors in a preprocessing step.
arXiv Detail & Related papers (2020-05-19T05:54:14Z) - A fast and efficient Modal EM algorithm for Gaussian mixtures [0.0]
In the modal approach to clustering, clusters are defined as the local maxima of the underlying probability density function.
The Modal EM algorithm is an iterative procedure that can identify the local maxima of any density function; a minimal sketch for the Gaussian-mixture case follows this list.
arXiv Detail & Related papers (2020-02-10T08:34:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.