On a minimum enclosing ball of a collection of linear subspaces
- URL: http://arxiv.org/abs/2003.12455v1
- Date: Fri, 27 Mar 2020 15:00:49 GMT
- Title: On a minimum enclosing ball of a collection of linear subspaces
- Authors: Timothy Marrinan, P.-A. Absil, Nicolas Gillis
- Abstract summary: This paper concerns the minimax center of a collection of linear subspaces.
We propose and solve an optimization problem parametrized by the rank of the minimax center.
By scaling the objective and penalizing the information lost by the rank-$k$ minimax center, we jointly recover an optimal dimension, $k^*$, and a central subspace, $U^* \in$ Gr$(k^*,n)$, at the center of the minimum enclosing ball, that best represents the data.
- Score: 16.466372560527827
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper concerns the minimax center of a collection of linear subspaces.
When the subspaces are $k$-dimensional subspaces of $\mathbb{R}^n$, this can be
cast as finding the center of a minimum enclosing ball on a Grassmann manifold,
Gr$(k,n)$. For subspaces of different dimension, the setting becomes a disjoint
union of Grassmannians rather than a single manifold, and the problem is no
longer well-defined. However, natural geometric maps exist between these
manifolds with a well-defined notion of distance for the images of the
subspaces under the mappings. Solving the initial problem in this context leads
to a candidate minimax center on each of the constituent manifolds, but does
not inherently provide intuition about which candidate is the best
representation of the data. Additionally, the solutions of different rank are
generally not nested so a deflationary approach will not suffice, and the
problem must be solved independently on each manifold. We propose and solve an
optimization problem parametrized by the rank of the minimax center. The
solution is computed using a subgradient algorithm on the dual. By scaling the
objective and penalizing the information lost by the rank-$k$ minimax center,
we jointly recover an optimal dimension, $k^*$, and a central subspace, $U^*
\in$ Gr$(k^*,n)$ at the center of the minimum enclosing ball, that best
represents the data.
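As a rough, illustrative companion to the abstract above, the sketch below searches for a fixed-rank minimax center: a rank-$k$ subspace whose largest principal-angle distance to a collection of subspaces is small. It is a minimal sketch under stated assumptions, not the paper's dual subgradient algorithm; the projector-blending update and the names principal_angles, grassmann_distance, and minimax_center are illustrative choices.

```python
import numpy as np

def principal_angles(U, V):
    """Principal angles between span(U) and span(V), for orthonormal bases U, V."""
    s = np.linalg.svd(U.T @ V, compute_uv=False)
    return np.arccos(np.clip(s, 0.0, 1.0))

def grassmann_distance(U, V):
    """Geodesic-style distance: 2-norm of the principal-angle vector."""
    return np.linalg.norm(principal_angles(U, V))

def minimax_center(subspaces, k, n_iter=500, step=0.05, seed=0):
    """Heuristic search for a rank-k center reducing the maximum distance
    to the given subspaces (NOT the paper's dual subgradient algorithm)."""
    rng = np.random.default_rng(seed)
    n = subspaces[0].shape[0]
    U = np.linalg.qr(rng.standard_normal((n, k)))[0]   # random initial center
    for _ in range(n_iter):
        # The farthest subspace attains the max; nudge the center toward it.
        dists = [grassmann_distance(U, V) for V in subspaces]
        far = subspaces[int(np.argmax(dists))]
        # Blend orthogonal projectors, then retract back to rank k via the top-k eigenvectors.
        P = (1.0 - step) * (U @ U.T) + step * (far @ far.T)
        _, vecs = np.linalg.eigh(P)
        U = vecs[:, -k:]
    return U, max(grassmann_distance(U, V) for V in subspaces)

# Example: candidate rank-2 center for three random 2-dimensional subspaces of R^5.
rng = np.random.default_rng(1)
data = [np.linalg.qr(rng.standard_normal((5, 2)))[0] for _ in range(3)]
center, radius = minimax_center(data, k=2)
print(radius)
```

In the paper's setting one would additionally sweep the rank $k$, rescale the resulting radii, and penalize the information lost by the rank-$k$ center to select $k^*$; the sketch covers only a single fixed rank.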
Related papers
- Nearly Minimax Optimal Regret for Learning Linear Mixture Stochastic
Shortest Path [80.60592344361073]
We study the Stochastic Shortest Path (SSP) problem with a linear mixture transition kernel.
An agent repeatedly interacts with an environment and seeks to reach a certain goal state while minimizing the cumulative cost.
Existing works often assume a strictly positive lower bound of the iteration cost function or an upper bound of the expected length for the optimal policy.
arXiv Detail & Related papers (2024-02-14T07:52:00Z) - On Generalization Bounds for Projective Clustering [3.4490841399255823]
Given a set of points, clustering consists of partitioning them into $k$ clusters such that each point is as close as possible to the center to which it is assigned.
For center-based objectives, we show a convergence rate of $\tilde{O}\left(\sqrt{\frac{kj^2}{n}}\right)$.
For subspace clustering with $j$-dimensional subspaces, we show a convergence rate of $\tilde{O}\left(\sqrt{\frac{kj^2}{n}}\right)$.
arXiv Detail & Related papers (2023-10-13T14:15:54Z) - Decentralized Riemannian Algorithm for Nonconvex Minimax Problems [82.50374560598493]
Minimax algorithms for neural networks have been developed to solve many problems.
In this paper, we propose two types of minimax algorithms.
For the stochastic setting, we propose DRSGDA and prove that our method achieves a gradient complexity guarantee.
arXiv Detail & Related papers (2023-02-08T01:42:45Z) - Pseudonorm Approachability and Applications to Regret Minimization [73.54127663296906]
We convert high-dimensional $\ell_\infty$-approachability problems to low-dimensional pseudonorm approachability problems.
We develop an algorithmic theory of pseudonorm approachability, analogous to previous work on approachability for $\ell_2$ and other norms.
arXiv Detail & Related papers (2023-02-03T03:19:14Z) - Best Policy Identification in Linear MDPs [70.57916977441262]
We investigate the problem of best policy identification in discounted linear Markov Decision Processes (MDPs) in the fixed confidence setting under a generative model.
The lower bound, expressed as the solution of an intricate non-convex optimization program, can be used as the starting point to devise such algorithms.
arXiv Detail & Related papers (2022-08-11T04:12:50Z) - Faster Perturbed Stochastic Gradient Methods for Finding Local Minima [92.99933928528797]
We propose Pullback, a faster perturbed stochastic gradient framework for finding local minima.
We show that Pullback with gradient estimators such as SARAH/SPIDER and STORM can find $(\epsilon, \epsilon_H)$-approximate local minima within $\tilde{O}(\epsilon^{-3} + \epsilon_H^{-6})$ stochastic gradient evaluations.
The core idea of our framework is a "step-size pullback" scheme to control the average movement of the gradient evaluations.
arXiv Detail & Related papers (2021-10-25T07:20:05Z) - Dimensionality Reduction for Wasserstein Barycenter [6.327655795051619]
We study dimensionality reduction techniques for the Wasserstein barycenter problem.
We show that randomized dimensionality reduction can be used to map the problem to a space of dimension $O(\log n)$, independent of both $d$ and $k$ (a generic random-projection sketch is shown after this list).
We also provide a coreset construction for the Wasserstein barycenter problem that significantly decreases the number of input distributions.
arXiv Detail & Related papers (2021-10-18T02:57:25Z) - Randomized Dimensionality Reduction for Facility Location and
Single-Linkage Clustering [13.208510864854894]
Random dimensionality reduction is a versatile tool for speeding up algorithms for high-dimensional problems.
We study its application to two clustering problems: the facility location problem, and the single-linkage hierarchical clustering problem.
arXiv Detail & Related papers (2021-07-05T05:55:26Z) - Geometry of the Loss Landscape in Overparameterized Neural Networks:
Symmetries and Invariances [9.390008801320024]
We show that adding one extra neuron to each layer is sufficient to connect all previously discrete minima into a single manifold.
We show that the number of symmetry-induced critical subspaces dominates the number of affine subspaces forming the global minima manifold.
arXiv Detail & Related papers (2021-05-25T21:19:07Z) - A deep network construction that adapts to intrinsic dimensionality
beyond the domain [79.23797234241471]
We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep networks with ReLU activation.
We focus on two intuitive and practically relevant choices for $\phi$: the projection onto a low-dimensional embedded submanifold and a distance to a collection of low-dimensional sets.
arXiv Detail & Related papers (2020-08-06T09:50:29Z) - Coreset-based Strategies for Robust Center-type Problems [0.6875312133832077]
We devise coreset-based strategies for the two problems which yield efficient sequential, MapReduce, and Streaming algorithms.
For wide ranges of the parameters $k, z, \epsilon, D$, we obtain a sequential algorithm with running time linear in $|V|$, and MapReduce/Streaming algorithms with few rounds/passes and substantially sublinear local/working memory.
arXiv Detail & Related papers (2020-02-18T10:04:08Z)
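Several of the entries above (the Wasserstein barycenter and facility-location/single-linkage papers) rely on randomized dimensionality reduction in the Johnson-Lindenstrauss style. The snippet below is a minimal, generic sketch of such a projection, not the constructions used in those papers; the name random_projection and the Gaussian choice are illustrative assumptions.

```python
import numpy as np

def random_projection(X, target_dim, seed=0):
    """Project the rows of X (m x d) into target_dim dimensions with a scaled
    Gaussian matrix. Pairwise distances are preserved up to a (1 +/- eps)
    factor with high probability when target_dim is on the order of log(m)/eps^2."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((X.shape[1], target_dim)) / np.sqrt(target_dim)
    return X @ G

# Example: 1000 points in R^500 mapped down to 40 dimensions.
X = np.random.default_rng(2).standard_normal((1000, 500))
Y = random_projection(X, target_dim=40)
```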
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.