Spherical dimension
- URL: http://arxiv.org/abs/2503.10240v1
- Date: Thu, 13 Mar 2025 10:32:25 GMT
- Title: Spherical dimension
- Authors: Bogdan Chornomaz, Shay Moran, Tom Waknine
- Abstract summary: The spherical dimension is a natural relaxation of the VC dimension. It serves as a common foundation for leveraging the Borsuk-Ulam theorem and related topological tools.
- Score: 15.07787640047213
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We introduce and study the spherical dimension, a natural topological relaxation of the VC dimension that unifies several results in learning theory where topology plays a key role in the proofs. The spherical dimension is defined by extending the set of realizable datasets (used to define the VC dimension) to the continuous space of realizable distributions. In this space, a shattered set of size d (in the VC sense) is completed into a continuous object, specifically a d-dimensional sphere of realizable distributions. The spherical dimension is then defined as the dimension of the largest sphere in this space. Thus, the spherical dimension is at least the VC dimension. The spherical dimension serves as a common foundation for leveraging the Borsuk-Ulam theorem and related topological tools. We demonstrate the utility of the spherical dimension in diverse applications, including disambiguations of partial concept classes, reductions from classification to stochastic convex optimization, stability and replicability, and sample compression schemes. Perhaps surprisingly, we show that the open question posed by Alon, Hanneke, Holzman, and Moran (FOCS 2021) of whether there exist non-trivial disambiguations for halfspaces with margin is equivalent to the basic open question of whether the VC and spherical dimensions are finite together.
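For orientation, here is a minimal LaTeX sketch of the definition chain paraphrased from the abstract above; the notation ($\mathcal{H}$, $\mathrm{VC}$, $\mathrm{sdim}$) is illustrative and not taken from the paper.

```latex
% Minimal sketch of the definition chain described in the abstract.
% The notation (H, VC, sdim) is illustrative, not the paper's own.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $\mathcal{H} \subseteq \{0,1\}^{\mathcal{X}}$ be a concept class.
A finite set $S \subseteq \mathcal{X}$ is \emph{shattered} if every labeling
of $S$ is realized by some $h \in \mathcal{H}$, and
\[
  \mathrm{VC}(\mathcal{H}) \;=\; \sup \{\, |S| : S \text{ is shattered by } \mathcal{H} \,\}.
\]
% Per the abstract, realizable datasets are extended to the continuous
% space of realizable distributions; a shattered set of size $d$ completes
% into a $d$-dimensional sphere of realizable distributions in that space.
\[
  \mathrm{sdim}(\mathcal{H}) \;=\; \text{dimension of the largest such sphere},
  \qquad
  \mathrm{VC}(\mathcal{H}) \;\le\; \mathrm{sdim}(\mathcal{H}).
\]

\end{document}
```

The inequality in the last line simply records the abstract's observation that every shattered set already sits on such a sphere, so the spherical dimension is at least the VC dimension.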
Related papers
- On consistent estimation of dimension values [45.52331418900137]
The problem of estimating the dimension of a compact subset S of Euclidean space is considered.
The emphasis is put on consistency results in the statistical sense.
arXiv Detail & Related papers (2024-12-18T14:40:37Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Understanding and Mitigating Hyperbolic Dimensional Collapse in Graph Contrastive Learning [70.0681902472251]
We propose a novel contrastive learning framework to learn high-quality graph embeddings in hyperbolic space.
Specifically, we design an alignment metric that effectively captures the hierarchical data-invariant information.
We show that in hyperbolic space one has to address the leaf- and height-level uniformity related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z) - Continuous percolation in a Hilbert space for a large system of qubits [58.720142291102135]
The percolation transition is defined through the appearance of the infinite cluster.
We show that the exponentially increasing dimensionality of the Hilbert space makes its covering by finite-size hyperspheres inefficient.
Our approach to the percolation transition in compact metric spaces may prove useful for its rigorous treatment in other contexts.
arXiv Detail & Related papers (2022-10-15T13:53:21Z) - Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z) - Point-Dimension Theory (Part I): The Point-Extended Box Dimension [0.0]
This article is the introduction to a larger research project devoted to the pure, applied, and philosophical aspects of dimension theory.
We propose two new ways of conceiving the notion of dimension, which are two sides of the same coin.
In connection with Boltzmann and Shannon entropies, dimension appears essentially as a comparison between entropies of sets.
arXiv Detail & Related papers (2022-05-27T09:43:36Z) - Nested-sphere description of the N-level Chern number and the
generalized Bloch hypersphere [0.0]
The geometric interpretation of (pseudo)spin-1/2 systems on the Bloch sphere has been appreciated across areas ranging from condensed matter to quantum information and high-energy physics.
We here employ a coherence vector description to theoretically characterize a general N-level system on the higher dimensional generalized Bloch (hyper)sphere.
We demonstrate that for the N-level case, there is an exterior two-sphere that provides a useful characterization of the system, notably by playing a primary role in determining the Chern number.
arXiv Detail & Related papers (2021-10-13T18:00:01Z) - Pure Exploration in Kernel and Neural Bandits [90.23165420559664]
We study pure exploration in bandits, where the dimension of the feature representation can be much larger than the number of arms.
To overcome the curse of dimensionality, we propose to adaptively embed the feature representation of each arm into a lower-dimensional space.
arXiv Detail & Related papers (2021-06-22T19:51:59Z) - A Topological Approach to Inferring the Intrinsic Dimension of Convex
Sensing Data [0.0]
We consider a common measurement paradigm, where an unknown subset of an affine space is measured by unknown quasi-filtration functions.
In this paper, we develop a method for inferring the dimension of the data under natural assumptions.
arXiv Detail & Related papers (2020-07-07T05:35:23Z) - Geometry of Similarity Comparisons [51.552779977889045]
We show that the ordinal capacity of a space form is related to its dimension and the sign of its curvature.
More importantly, we show that the statistical behavior of the ordinal spread random variables defined on a similarity graph can be used to identify its underlying space form.
arXiv Detail & Related papers (2020-06-17T13:37:42Z) - Estimate of the Neural Network Dimension using Algebraic Topology and
Lie Theory [0.0]
In this paper we present an approach to determine the smallest possible number of neurons in a layer of a neural network.
We specify the required dimensions precisely, assuming that there is a smooth manifold on or near which the data are located.
We derive the theory and validate it experimentally on toy data sets.
arXiv Detail & Related papers (2020-04-06T14:15:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.