Farthest sampling segmentation of triangulated surfaces
- URL: http://arxiv.org/abs/2012.00478v1
- Date: Tue, 1 Dec 2020 13:31:44 GMT
- Title: Farthest sampling segmentation of triangulated surfaces
- Authors: Victoria Hernández-Mederos, Dimas Martínez, Jorge
Estrada-Sarlabous and Valia Guerra-Ones
- Abstract summary: Farthest Sampling Segmentation (FSS) is a new method for segmentation of triangulated surfaces.
The FSS method does not depend on parameters that must be tuned by hand and it is very flexible.
Numerical experiments with several metrics and a wide variety of 3D triangular meshes show that the segmentations obtained by computing fewer than 10% of the columns of $W$ are as good as those obtained by clustering the rows of the full matrix $W$.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper we introduce Farthest Sampling Segmentation (FSS), a new method
for segmentation of triangulated surfaces, which consists of two fundamental
steps: the computation of a submatrix $W^k$ of the affinity matrix $W$ and the
application of the k-means clustering algorithm to the rows of $W^k$. The
submatrix $W^k$ is obtained by computing the affinity between all triangles and
only a few special triangles: those which are farthest apart in the chosen metric.
This is equivalent to selecting a sample of the columns of $W$ without
constructing the matrix completely. The proposed method is computationally
cheaper than other segmentation algorithms, since it calculates only a few
columns of $W$ and does not require the eigendecomposition of $W$ or of any of
its submatrices.
We prove that the orthogonal projection of $W$ on the space generated by the
columns of $W^k$ coincides with the orthogonal projection of $W$ on the space
generated by the $k$ eigenvectors computed by Nyström's method using the
columns of $W^k$ as a sample of $W$. Further, it is shown that for increasing
size $k$, the proximity relationship among the rows of $W^k$ tends to
faithfully reflect the proximity among the corresponding rows of $W$.
The FSS method does not depend on parameters that must be tuned by hand and
is very flexible, since it can handle any metric to define the distance
between triangles. Numerical experiments with several metrics and a wide
variety of 3D triangular meshes show that the segmentations obtained by
computing fewer than 10% of the columns of $W$ are as good as those obtained
by clustering the rows of the full matrix $W$.
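The two steps described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: it assumes a precomputed pairwise distance matrix (in whatever metric is chosen), a Gaussian affinity $W_{ij} = \exp(-d_{ij}^2/2\sigma^2)$ with an ad hoc bandwidth, and a plain Lloyd's k-means in place of a library implementation; the names `farthest_point_sample` and `fss_segmentation` are illustrative.

```python
import numpy as np

def farthest_point_sample(dist, k):
    """Greedily pick k indices so that each new pick is farthest
    (in the given metric) from everything already chosen."""
    chosen = [0]
    mind = dist[0].copy()          # distance of every triangle to the sample set
    for _ in range(k - 1):
        j = int(np.argmax(mind))   # farthest remaining triangle
        chosen.append(j)
        mind = np.minimum(mind, dist[j])
    return np.array(chosen)

def fss_segmentation(dist, k, n_clusters, sigma=None, n_iter=50):
    """Cluster triangles using only the n x k submatrix W^k of the affinity matrix."""
    samples = farthest_point_sample(dist, k)
    if sigma is None:
        sigma = dist.mean()        # ad hoc bandwidth choice, for illustration only
    # W^k: Gaussian affinity of every triangle to the k sampled triangles only
    Wk = np.exp(-dist[:, samples] ** 2 / (2.0 * sigma ** 2))
    # Plain Lloyd's k-means on the rows of W^k, seeded deterministically
    # by farthest-point initialization on the rows themselves
    centers = Wk[[0]]
    for _ in range(n_clusters - 1):
        d2 = ((Wk[:, None] - centers[None]) ** 2).sum(-1).min(1)
        centers = np.vstack([centers, Wk[np.argmax(d2)]])
    for _ in range(n_iter):
        labels = np.argmin(((Wk[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centers[c] = Wk[labels == c].mean(axis=0)
    return labels
```

Note that the full $n \times n$ matrix $W$ is never formed: only the $k$ columns indexed by the farthest samples are evaluated, which is the source of the computational savings the abstract describes.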
Related papers
- Optimal Sketching for Residual Error Estimation for Matrix and Vector Norms [50.15964512954274]
We study the problem of residual error estimation for matrix and vector norms using a linear sketch.
We demonstrate that this gives a substantial advantage empirically, for roughly the same sketch size and accuracy as in previous work.
We also show an $\Omega(k^{2/p} n^{1-2/p})$ lower bound for the sparse recovery problem, which is tight up to a $\mathrm{poly}(\log n)$ factor.
arXiv Detail & Related papers (2024-08-16T02:33:07Z) - Parameterized Approximation for Robust Clustering in Discrete Geometric Spaces [2.687607197645453]
We show that even the special case of $k$-Center in dimension $\Theta(\log n)$ is $(\sqrt{3/2} - o(1))$-hard to approximate for FPT algorithms.
arXiv Detail & Related papers (2023-05-12T08:43:28Z) - Low-Rank Approximation with $1/\epsilon^{1/3}$ Matrix-Vector Products [58.05771390012827]
We study iterative methods based on Krylov subspaces for low-rank approximation under any Schatten-$p$ norm.
Our main result is an algorithm that uses only $\tilde{O}(k/\sqrt{\epsilon})$ matrix-vector products.
arXiv Detail & Related papers (2022-02-10T16:10:41Z) - Spectral properties of sample covariance matrices arising from random
matrices with independent non identically distributed columns [50.053491972003656]
It was previously shown that the functionals $\mathrm{tr}(A R(z))$, for $R(z) = (\frac{1}{n}XX^T - zI_p)^{-1}$ and $A \in \mathcal{M}_p$ deterministic, have a standard deviation of order $O(\|A\|_* / \sqrt{n})$.
Here, we show that $\|\mathbb{E}[R(z)] - \tilde{R}(z)\|_F$
arXiv Detail & Related papers (2021-09-06T14:21:43Z) - Learning a Latent Simplex in Input-Sparsity Time [58.30321592603066]
We consider the problem of learning a latent $k$-vertex simplex $K \subset \mathbb{R}^{d \times n}$, given access to $A \in \mathbb{R}^{d \times n}$.
We show that the dependence on $k$ in the running time is unnecessary given a natural assumption about the mass of the top $k$ singular values of $A$.
arXiv Detail & Related papers (2021-05-17T16:40:48Z) - Deep Learning Meets Projective Clustering [66.726500395069]
A common approach for compressing NLP networks is to encode the embedding layer as a matrix $A \in \mathbb{R}^{n \times d}$.
Inspired by projective clustering from computational geometry, we suggest replacing this subspace by a set of $k$ subspaces.
arXiv Detail & Related papers (2020-10-08T22:47:48Z) - Compressed Deep Networks: Goodbye SVD, Hello Robust Low-Rank
Approximation [23.06440095688755]
A common technique for compressing a neural network is to compute the $k$-rank $\ell_2$ approximation $A_{k,2}$ of the matrix $A \in \mathbb{R}^{n \times d}$ that corresponds to a fully connected layer (or embedding layer).
Here, $d$ is the number of neurons in the layer, $n$ is the number of neurons in the next one, and $A_{k,2}$ can be stored in $O((n+d)k)$ memory instead of $O(nd)$.
arXiv Detail & Related papers (2020-09-11T20:21:42Z) - The Average-Case Time Complexity of Certifying the Restricted Isometry
Property [66.65353643599899]
In compressed sensing, the restricted isometry property (RIP) on $M \times N$ sensing matrices guarantees efficient reconstruction of sparse vectors.
We investigate the exact average-case time complexity of certifying the RIP property for $M \times N$ matrices with i.i.d. $\mathcal{N}(0, 1/M)$ entries.
arXiv Detail & Related papers (2020-05-22T16:55:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.