Go with the Flows: Mixtures of Normalizing Flows for Point Cloud
Generation and Reconstruction
- URL: http://arxiv.org/abs/2106.03135v1
- Date: Sun, 6 Jun 2021 14:25:45 GMT
- Title: Go with the Flows: Mixtures of Normalizing Flows for Point Cloud
Generation and Reconstruction
- Authors: Janis Postels, Mengya Liu, Riccardo Spezialetti, Luc Van Gool,
Federico Tombari
- Abstract summary: Normalizing flows (NFs) have demonstrated state-of-the-art performance on modeling 3D point clouds.
This work enhances their representational power by applying mixtures of NFs to point clouds.
- Score: 98.38585659305325
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, normalizing flows (NFs) have demonstrated state-of-the-art
performance on modeling 3D point clouds while allowing sampling with arbitrary
resolution at inference time. However, these flow-based models still require
long training times and large models for representing complicated geometries.
This work enhances their representational power by applying mixtures of NFs to
point clouds. We show that in this more general framework each component learns
to specialize in a particular subregion of an object in a completely
unsupervised fashion. By instantiating each mixture component with a
comparatively small NF we generate point clouds with improved details compared
to single-flow-based models while using fewer parameters and considerably
reducing the inference runtime. We further demonstrate that by adding data
augmentation, individual mixture components can learn to specialize in a
semantically meaningful manner. We evaluate mixtures of NFs on generation,
autoencoding and single-view reconstruction based on the ShapeNet dataset.
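
The core idea above, a mixture of comparatively small NFs whose components softly partition an object, can be illustrated compactly. The PyTorch snippet below is a minimal sketch only, not the authors' implementation: the coupling-layer design, layer sizes, and class names (AffineCoupling, SmallFlow, MixtureOfFlows) are assumptions made for illustration.

```python
# Minimal sketch (not the paper's code): a mixture of K small affine-coupling
# normalizing flows over 3D points, trained by maximizing the mixture
# log-likelihood of one object's points.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style coupling layer acting on 3D points."""
    def __init__(self, mask):
        super().__init__()
        self.register_buffer("mask", mask)  # 1 = coordinate passed through unchanged
        self.net = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 6))

    def forward(self, x):                                   # x: (N, 3)
        x_fixed = x * self.mask
        s, t = self.net(x_fixed).chunk(2, dim=-1)           # scale and shift
        s = torch.tanh(s) * (1 - self.mask)
        t = t * (1 - self.mask)
        y = x_fixed + (1 - self.mask) * (x * torch.exp(s) + t)
        return y, s.sum(dim=-1)                             # transformed points, log|det J|

class SmallFlow(nn.Module):
    """A comparatively small NF: a short stack of coupling layers."""
    def __init__(self, n_layers=4):
        super().__init__()
        masks = [torch.tensor([1., 1., 0.]) if i % 2 == 0 else torch.tensor([0., 0., 1.])
                 for i in range(n_layers)]
        self.layers = nn.ModuleList(AffineCoupling(m) for m in masks)
        self.base = torch.distributions.Normal(0.0, 1.0)    # standard normal base density

    def log_prob(self, x):
        log_det = torch.zeros(x.shape[0], device=x.device)
        for layer in self.layers:
            x, ld = layer(x)
            log_det = log_det + ld
        return self.base.log_prob(x).sum(dim=-1) + log_det

class MixtureOfFlows(nn.Module):
    """Mixture of K flows; the log-likelihood is a log-sum-exp over components."""
    def __init__(self, n_components=4):
        super().__init__()
        self.flows = nn.ModuleList(SmallFlow() for _ in range(n_components))
        self.logits = nn.Parameter(torch.zeros(n_components))  # learnable mixture weights

    def log_prob(self, x):
        log_w = torch.log_softmax(self.logits, dim=0)                    # (K,)
        comp = torch.stack([f.log_prob(x) for f in self.flows], dim=-1)  # (N, K)
        return torch.logsumexp(comp + log_w, dim=-1)                     # (N,)

# Usage: treat one point cloud as N i.i.d. 3D samples and maximize likelihood.
points = torch.randn(2048, 3)              # placeholder point cloud
model = MixtureOfFlows(n_components=4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
optimizer.zero_grad()
loss = -model.log_prob(points).mean()      # negative log-likelihood
loss.backward()
optimizer.step()
```

Because the mixture log-likelihood is a log-sum-exp over components, each point is softly assigned to whichever component explains it best, which is the mechanism behind the unsupervised specialization on object subregions described in the abstract.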
Related papers
- Segmenting objects with Bayesian fusion of active contour models and convnet priors [0.729597981661727]
We propose a novel instance segmentation method geared towards Natural Resource Monitoring (NRM) imagery.
We formulate the problem as Bayesian maximum a posteriori inference which, in learning the individual object contours, incorporates shape, location, and position priors.
In experiments, we tackle the challenging, real-world problem of segmenting individual dead tree crowns and delineating their precise contours.
arXiv Detail & Related papers (2024-10-09T20:36:43Z) - ComboStoc: Combinatorial Stochasticity for Diffusion Generative Models [65.82630283336051]
We show that the space spanned by the combination of dimensions and attributes is insufficiently sampled by the existing training schemes of diffusion generative models.
We present a simple fix to this problem by constructing stochastic processes that fully exploit the combinatorial structures, hence the name ComboStoc.
arXiv Detail & Related papers (2024-05-22T15:23:10Z) - EPiC-ly Fast Particle Cloud Generation with Flow-Matching and Diffusion [0.7255608805275865]
We present two novel methods that generate LHC jets as point clouds efficiently and accurately.
EPiC-JeDi and EPiC-FM both achieve state-of-the-art performance on the top-quark JetNet datasets.
arXiv Detail & Related papers (2023-09-29T18:00:03Z) - StarNet: Style-Aware 3D Point Cloud Generation [82.30389817015877]
StarNet is able to reconstruct and generate high-fidelity 3D point clouds using a mapping network.
Our framework achieves comparable state-of-the-art performance on various metrics in the point cloud reconstruction and generation tasks.
arXiv Detail & Related papers (2023-03-28T08:21:44Z) - PRANC: Pseudo RAndom Networks for Compacting deep models [22.793523211040682]
PRANC enables significant compaction of a deep model.
In this study, we employ PRANC to condense image classification models and compress images by compacting their associated implicit neural networks.
arXiv Detail & Related papers (2022-06-16T22:03:35Z) - Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNF).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z) - PU-Flow: a Point Cloud Upsampling Networkwith Normalizing Flows [58.96306192736593]
We present PU-Flow, which incorporates normalizing flows and feature interpolation techniques to produce dense points uniformly distributed on the underlying surface.
Specifically, we formulate the upsampling process as point interpolation in a latent space, where the interpolation weights are adaptively learned from local geometric context (see the latent-interpolation sketch after this list).
We show that our method outperforms state-of-the-art deep learning-based approaches in terms of reconstruction quality, proximity-to-surface accuracy, and computation efficiency.
arXiv Detail & Related papers (2021-07-13T07:45:48Z) - Discrete Point Flow Networks for Efficient Point Cloud Generation [36.03093265136374]
Generative models have proven effective at modeling 3D shapes and their statistical variations.
We introduce a latent variable model that builds on normalizing flows to generate 3D point clouds of an arbitrary size.
For single-view shape reconstruction we also obtain results on par with state-of-the-art voxel, point cloud, and mesh-based methods.
arXiv Detail & Related papers (2020-07-20T14:48:00Z) - Normalizing Flows with Multi-Scale Autoregressive Priors [131.895570212956]
We introduce channel-wise dependencies in the latent space of normalizing flows through multi-scale autoregressive priors (mAR).
Our mAR prior for models with split coupling flow layers (mAR-SCF) can better capture dependencies in complex multimodal data.
We show that mAR-SCF allows for improved image generation quality, with gains in FID and Inception scores compared to state-of-the-art flow-based models.
arXiv Detail & Related papers (2020-04-08T09:07:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.