Gaussian Mixture Convolution Networks
- URL: http://arxiv.org/abs/2202.09153v1
- Date: Fri, 18 Feb 2022 12:07:52 GMT
- Title: Gaussian Mixture Convolution Networks
- Authors: Adam Celarek, Pedro Hermosilla, Bernhard Kerbl, Timo Ropinski, Michael Wimmer
- Abstract summary: This paper proposes a novel method for deep learning based on the analytical convolution of multidimensional Gaussian mixtures.
We demonstrate that networks based on this architecture reach competitive accuracy on Gaussian mixtures fitted to the MNIST and ModelNet data sets.
- Score: 13.493166990188278
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper proposes a novel method for deep learning based on the analytical
convolution of multidimensional Gaussian mixtures. In contrast to tensors,
these do not suffer from the curse of dimensionality and allow for a compact
representation, as data is only stored where details exist. Convolution kernels
and data are Gaussian mixtures with unconstrained weights, positions, and
covariance matrices. Similar to discrete convolutional networks, each
convolution step produces several feature channels, represented by independent
Gaussian mixtures. Since traditional transfer functions like ReLUs do not
produce Gaussian mixtures, we propose fitting these functions with Gaussian
mixtures instead. This fitting step also acts as a pooling layer if the number of
Gaussian components is reduced appropriately. We demonstrate that networks
based on this architecture reach competitive accuracy on Gaussian mixtures
fitted to the MNIST and ModelNet data sets.
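The analytical convolution the abstract refers to rests on a standard identity: the convolution of two Gaussians satisfies $\mathcal{N}(\mu_1, \Sigma_1) * \mathcal{N}(\mu_2, \Sigma_2) = \mathcal{N}(\mu_1 + \mu_2, \Sigma_1 + \Sigma_2)$, and convolution distributes over the sums of a mixture, so every data component is convolved with every kernel component while the weights multiply. Below is a minimal NumPy sketch of this identity (illustrative only, not the authors' implementation; all function and variable names are ours):

```python
import numpy as np

def convolve_gm(weights_a, means_a, covs_a, weights_b, means_b, covs_b):
    """Analytically convolve two Gaussian mixtures.

    Each mixture is given as (weights, means, covariances) with shapes
    (n,), (n, d), (n, d, d); weights may be unconstrained, as in the paper.
    The result has n_a * n_b components.
    """
    w, mu, cov = [], [], []
    for wa, ma, ca in zip(weights_a, means_a, covs_a):
        for wb, mb, cb in zip(weights_b, means_b, covs_b):
            w.append(wa * wb)    # weights multiply
            mu.append(ma + mb)   # means add
            cov.append(ca + cb)  # covariances add
    return np.array(w), np.array(mu), np.array(cov)

# Example: a 2-component 2D "data" mixture convolved with a 1-component kernel.
wd = np.array([0.7, 0.3])
md = np.array([[0.0, 0.0], [1.0, 1.0]])
cd = np.stack([np.eye(2), 0.5 * np.eye(2)])
wk = np.array([1.0])
mk = np.array([[0.2, -0.1]])
ck = np.stack([0.1 * np.eye(2)])

w, mu, cov = convolve_gm(wd, md, cd, wk, mk, ck)
print(w.shape, mu.shape, cov.shape)  # (2,) (2, 2) (2, 2, 2)
```

Because the component count grows multiplicatively with every such convolution, the fitting step described in the abstract, which approximates the transfer function's output by a new and possibly smaller mixture, is also what keeps the representation compact and doubles as pooling.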
Related papers
- Learning Mixtures of Gaussians Using Diffusion Models [9.118706387430883]
We give a new algorithm for learning mixtures of $k$ Gaussians to within a target total variation (TV) error.
Our approach is analytic and relies on the framework of diffusion models.
arXiv Detail & Related papers (2024-04-29T17:00:20Z)
- Efficient Learning of Convolution Weights as Gaussian Mixture Model Posteriors [0.0]
We show that the feature map of a convolution layer is equivalent to the unnormalized log posterior of a special kind of Gaussian mixture for image modeling (a worked form of this identity appears after this list).
We then expand the model to drive diverse features and propose a corresponding EM algorithm to learn it.
arXiv Detail & Related papers (2024-01-30T19:35:07Z)
- Fast Semisupervised Unmixing Using Nonconvex Optimization [80.11512905623417]
We introduce a novel nonconvex model for semisupervised/library-based unmixing.
We demonstrate the efficacy of the proposed alternating optimization methods for sparse unmixing.
arXiv Detail & Related papers (2024-01-23T10:07:41Z) - Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers for diffusion models, called Gaussian Mixture Solvers (GMS).
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z) - Compound Batch Normalization for Long-tailed Image Classification [77.42829178064807]
We propose a compound batch normalization method based on a Gaussian mixture.
It can model the feature space more comprehensively and reduce the dominance of head classes.
The proposed method outperforms existing methods on long-tailed image classification.
arXiv Detail & Related papers (2022-12-02T07:31:39Z) - Overlap-guided Gaussian Mixture Models for Point Cloud Registration [61.250516170418784]
Probabilistic 3D point cloud registration methods have shown competitive performance in overcoming noise, outliers, and density variations.
This paper proposes a novel overlap-guided probabilistic registration approach that computes the optimal transformation from matched Gaussian Mixture Model (GMM) parameters.
arXiv Detail & Related papers (2022-10-17T08:02:33Z) - Mixtures of Gaussian Process Experts with SMC$^2$ [0.4588028371034407]
Mixtures of Gaussian process experts have been considered, where data points are assigned to independent experts.
We construct a novel inference approach based on nested sequential Monte Carlo samplers to infer both the gating network and Gaussian process expert parameters.
arXiv Detail & Related papers (2022-08-26T18:20:14Z) - Gaussian mixture model on nodes of Bayesian network given maximal
parental cliques [0.0]
We explain why and how Gaussian mixture models are used on the nodes of a Bayesian network.
We propose a new method, called the double iteration algorithm, to optimize the mixture model.
arXiv Detail & Related papers (2022-04-20T15:14:01Z) - A Robust and Flexible EM Algorithm for Mixtures of Elliptical
Distributions with Missing Data [71.9573352891936]
This paper tackles the problem of missing data imputation for noisy and non-Gaussian data.
A new EM algorithm is investigated for mixtures of elliptical distributions with the property of handling potential missing data.
Experimental results on synthetic data demonstrate that the proposed algorithm is robust to outliers and can be used with non-Gaussian data.
arXiv Detail & Related papers (2022-01-28T10:01:37Z) - Clustering of non-Gaussian data by variational Bayes for normal inverse
Gaussian mixture models [0.0]
In practice, many data sets are non-Gaussian, being heavy-tailed and/or asymmetric.
For normal inverse Gaussian (NIG) mixture models, both expectation-maximization (EM) and variational Bayesian (VB) algorithms have been proposed.
We propose another VB algorithm for NIG mixtures that addresses the shortcomings of existing approaches.
We also propose an extension of Dirichlet process mixture models to overcome the difficulty of determining the number of clusters.
arXiv Detail & Related papers (2020-09-13T14:13:27Z) - Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Thanks to their guaranteed expressivity, these flows can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
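On the "Efficient Learning of Convolution Weights as Gaussian Mixture Model Posteriors" entry above, the claimed equivalence can be illustrated with a standard identity for an isotropic Gaussian component; the notation here (image patch $x$, component mean $\mu_k$, variance $\sigma^2$, dimension $d$) is ours and need not match the paper's:

```latex
\log \mathcal{N}(x;\, \mu_k,\, \sigma^2 I)
  = \frac{\langle x, \mu_k \rangle}{\sigma^2}
  - \frac{\lVert \mu_k \rVert^2}{2\sigma^2}
  - \frac{\lVert x \rVert^2}{2\sigma^2}
  - \frac{d}{2}\log(2\pi\sigma^2)
```

The first term is exactly the correlation a convolution layer computes with kernel $\mu_k / \sigma^2$, the second is a per-channel bias, and the last two terms are shared by all channels; adding the log mixture weights as biases therefore turns the feature map into an unnormalized log posterior over components.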
This list is automatically generated from the titles and abstracts of the papers on this site.