Point Cloud Classification via Deep Set Linearized Optimal Transport
- URL: http://arxiv.org/abs/2401.01460v1
- Date: Tue, 2 Jan 2024 23:26:33 GMT
- Title: Point Cloud Classification via Deep Set Linearized Optimal Transport
- Authors: Scott Mahan, Caroline Moosmüller, Alexander Cloninger
- Abstract summary: We introduce Deep Set Linearized Optimal Transport, an algorithm designed for the efficient simultaneous embedding of point clouds into an $L^2$-space.
This embedding preserves specific low-dimensional structures within the Wasserstein space while constructing a classifier to distinguish between various classes of point clouds.
We showcase the advantages of our algorithm over the standard deep set approach through experiments on a flow cytometry dataset with a limited number of labeled point clouds.
- Score: 51.99765487172328
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Deep Set Linearized Optimal Transport, an algorithm designed for
the efficient simultaneous embedding of point clouds into an $L^2$-space. This
embedding preserves specific low-dimensional structures within the Wasserstein
space while constructing a classifier to distinguish between various classes of
point clouds. Our approach is motivated by the observation that $L^2$-distances
between optimal transport maps for distinct point clouds, originating from a
shared fixed reference distribution, provide an approximation of the
Wasserstein-2 distance between these point clouds, under certain assumptions.
To learn approximations of these transport maps, we employ input convex neural
networks (ICNNs) and establish that, under specific conditions, Euclidean
distances between samples from these ICNNs closely mirror Wasserstein-2
distances between the true distributions. Additionally, we train a
discriminator network that attaches weights to these samples and creates a
permutation-invariant classifier to differentiate between different classes of
point clouds. We showcase the advantages of our algorithm over the standard
deep set approach through experiments on a flow cytometry dataset with a
limited number of labeled point clouds.
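As a rough illustration of the linearized optimal transport idea described above (not the authors' implementation), the sketch below computes discrete Monge maps from a shared reference point cloud to each input cloud with a generic assignment solver, flattens them into vectors, and compares Euclidean distances between those vectors, which approximate Wasserstein-2 distances under the stated assumptions. The ICNN transport-map approximation and the weighted, permutation-invariant discriminator from the paper are not reproduced here; the function names `monge_map_to_reference` and `lot_embedding` are illustrative.

```python
# Hypothetical sketch of a linearized OT (LOT) embedding for point clouds.
# Assumes every cloud has the same number of points as the reference.
import numpy as np
from scipy.optimize import linear_sum_assignment


def monge_map_to_reference(reference, cloud):
    """Discrete Monge map: match each reference point to one cloud point by
    solving the squared-Euclidean assignment problem."""
    cost = ((reference[:, None, :] - cloud[None, :, :]) ** 2).sum(axis=-1)
    rows, cols = linear_sum_assignment(cost)
    T = np.empty_like(reference)
    T[rows] = cloud[cols]  # T[i] is the image of reference point i
    return T


def lot_embedding(reference, clouds):
    """Stack the flattened Monge maps; Euclidean distances between rows
    approximate Wasserstein-2 distances between the corresponding clouds."""
    return np.stack([monge_map_to_reference(reference, c).ravel() for c in clouds])


# Toy usage: two classes of 2-D point clouds with n points each.
rng = np.random.default_rng(0)
n, d = 64, 2
reference = rng.normal(size=(n, d))                  # shared reference distribution
class_a = [rng.normal(loc=0.0, size=(n, d)) for _ in range(5)]
class_b = [rng.normal(loc=2.0, size=(n, d)) for _ in range(5)]

emb = lot_embedding(reference, class_a + class_b)    # shape (10, n * d)

# Pairwise Euclidean distances in the embedding space act as a proxy for W2
# distances; any standard classifier, or a permutation-invariant deep-set head
# with learned per-sample weights as in the paper, could be trained on top.
pairwise = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
print(pairwise.round(1))
```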
Related papers
- Canonical Variates in Wasserstein Metric Space [16.668946904062032]
We employ the Wasserstein metric to measure distances between distributions, which are then used by distance-based classification algorithms.
Central to our investigation is dimension reduction within the Wasserstein metric space to enhance classification accuracy.
We introduce a novel approach grounded in the principle of maximizing Fisher's ratio, defined as the quotient of between-class variation to within-class variation.
arXiv Detail & Related papers (2024-05-24T17:59:21Z) - Multiway Point Cloud Mosaicking with Diffusion and Global Optimization [74.3802812773891]
We introduce a novel framework for multiway point cloud mosaicking (named Wednesday).
At the core of our approach is ODIN, a learned pairwise registration algorithm that identifies overlaps and refines attention scores.
Tested on four diverse, large-scale datasets, our method achieves state-of-the-art pairwise and rotation registration results by a large margin on all benchmarks.
arXiv Detail & Related papers (2024-03-30T17:29:13Z) - Density-invariant Features for Distant Point Cloud Registration [29.68594463362292]
We propose a Group-wise Contrastive Learning (GCL) scheme to extract density-invariant geometric features.
We propose a simple yet effective training scheme to force the features of multiple point clouds at the same spatial location to be similar.
The resulting fully-convolutional feature extractor is more powerful and density-invariant than state-of-the-art methods.
arXiv Detail & Related papers (2023-07-19T07:11:45Z) - Diffeomorphic Mesh Deformation via Efficient Optimal Transport for Cortical Surface Reconstruction [40.73187749820041]
Mesh deformation plays a pivotal role in many 3D vision tasks including dynamic simulations, rendering, and reconstruction.
A prevalent approach in current deep learning is the set-based approach, which measures the discrepancy between two surfaces by comparing two randomly sampled point clouds from the two meshes with the Chamfer pseudo-distance.
We propose a novel metric for learning mesh deformation, defined by the sliced Wasserstein distance on meshes represented as probability measures, which generalizes the set-based approach.
arXiv Detail & Related papers (2023-05-27T19:10:19Z) - Efficient Graph Field Integrators Meet Point Clouds [59.27295475120132]
We present two new classes of algorithms for efficient field integration on graphs encoding point clouds.
The first class, SeparatorFactorization(SF), leverages the bounded genus of point cloud mesh graphs, while the second class, RFDiffusion(RFD), uses popular epsilon-nearest-neighbor graph representations for point clouds.
arXiv Detail & Related papers (2023-02-02T08:33:36Z) - GeONet: a neural operator for learning the Wasserstein geodesic [13.468026138183623]
We present GeONet, a mesh-invariant deep neural operator network that learns the non-linear mapping from the input pair of initial and terminal distributions to the Wasserstein geodesic connecting the two endpoint distributions.
We demonstrate that GeONet achieves comparable testing accuracy to the standard OT solvers on simulation examples and the MNIST dataset with considerably reduced inference-stage computational cost by orders of magnitude.
arXiv Detail & Related papers (2022-09-28T21:55:40Z) - Self-Supervised Arbitrary-Scale Point Clouds Upsampling via Implicit
Neural Representation [79.60988242843437]
We propose a novel approach that achieves self-supervised and magnification-flexible point cloud upsampling simultaneously.
Experimental results demonstrate that our self-supervised learning based scheme achieves competitive or even better performance than supervised learning based state-of-the-art methods.
arXiv Detail & Related papers (2022-04-18T07:18:25Z) - Differentiable Convolution Search for Point Cloud Processing [114.66038862207118]
We propose a novel differentiable convolution search paradigm on point clouds.
It can work in a purely data-driven manner and thus is capable of auto-creating a group of suitable convolutions for geometric shape modeling.
We also propose a joint optimization framework for the simultaneous search of internal convolution and external architecture, and introduce an epsilon-greedy algorithm to alleviate the effect of discretization error.
arXiv Detail & Related papers (2021-08-29T14:42:03Z) - PU-Flow: a Point Cloud Upsampling Networkwith Normalizing Flows [58.96306192736593]
We present PU-Flow, which incorporates normalizing flows and adaptive weighting techniques to produce dense points uniformly distributed on the underlying surface.
Specifically, we formulate the upsampling process as a weighted combination of points in a latent space, where the weights are adaptively learned from the local geometric context.
We show that our method outperforms state-of-the-art deep learning-based approaches in terms of reconstruction quality, proximity-to-surface accuracy, and computation efficiency.
arXiv Detail & Related papers (2021-07-13T07:45:48Z) - DeepCLR: Correspondence-Less Architecture for Deep End-to-End Point
Cloud Registration [12.471564670462344]
This work addresses the problem of point cloud registration using deep neural networks.
We propose an approach to predict the alignment between two point clouds with overlapping data content, but displaced origins.
Our approach achieves state-of-the-art accuracy and the lowest run-time of the compared methods.
arXiv Detail & Related papers (2020-07-22T08:20:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.