Self-Supervised Learning by Curvature Alignment
- URL: http://arxiv.org/abs/2511.17426v1
- Date: Fri, 21 Nov 2025 17:22:31 GMT
- Title: Self-Supervised Learning by Curvature Alignment
- Authors: Benyamin Ghojogh, M. Hadi Sepanj, Paul Fieguth
- Abstract summary: We introduce CurvSSL, a curvature-regularized self-supervised learning framework. We show that explicitly shaping local geometry is a simple and effective complement to purely statistical SSL regularizers.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Self-supervised learning (SSL) has recently advanced through non-contrastive methods that couple an invariance term with variance, covariance, or redundancy-reduction penalties. While such objectives shape first- and second-order statistics of the representation, they largely ignore the local geometry of the underlying data manifold. In this paper, we introduce CurvSSL, a curvature-regularized self-supervised learning framework, and its RKHS extension, kernel CurvSSL. Our approach retains a standard two-view encoder-projector architecture with a Barlow Twins-style redundancy-reduction loss on projected features, but augments it with a curvature-based regularizer. Each embedding is treated as a vertex whose $k$ nearest neighbors define a discrete curvature score via cosine interactions on the unit hypersphere; in the kernel variant, curvature is computed from a normalized local Gram matrix in an RKHS. These scores are aligned and decorrelated across augmentations by a Barlow-style loss on a curvature-derived matrix, encouraging both view invariance and consistency of local manifold bending. Experiments on MNIST and CIFAR-10 datasets with a ResNet-18 backbone show that curvature-regularized SSL yields competitive or improved linear evaluation performance compared to Barlow Twins and VICReg. Our results indicate that explicitly shaping local geometry is a simple and effective complement to purely statistical SSL regularizers.
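The abstract's mechanism can be sketched in code. The snippet below is a minimal NumPy illustration, not the authors' implementation: the exact curvature formula and the construction of the curvature-derived matrix are assumptions inferred from the abstract (cosine interactions among $k$-nearest-neighbor directions on the unit hypersphere, followed by a Barlow-style on-diagonal/off-diagonal penalty), and the function names are hypothetical.

```python
import numpy as np

def curvature_scores(Z, k=4):
    # Hypothetical discrete curvature score: each embedding is a vertex
    # whose k nearest neighbors (by cosine similarity) define a score via
    # cosine interactions on the unit hypersphere. The exact formula is
    # an assumption; the paper's definition may differ.
    Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)   # project to unit sphere
    sims = Z @ Z.T
    np.fill_diagonal(sims, -np.inf)                    # exclude self-match
    nbrs = np.argsort(-sims, axis=1)[:, :k]            # k nearest neighbors
    scores = np.empty(len(Z))
    for i in range(len(Z)):
        D = Z[nbrs[i]] - Z[i]                          # edges to neighbors
        D /= np.linalg.norm(D, axis=1, keepdims=True)
        C = D @ D.T                                    # pairwise cosine interactions
        scores[i] = C[np.triu_indices(k, 1)].mean()    # mean off-diagonal cosine
    return scores

def curvature_barlow_loss(Z1, Z2, k=4, lam=5e-3):
    # Barlow-style alignment across two augmented views (sketch): standardize
    # per-view curvature scores, form a curvature-derived matrix, then pull
    # its diagonal toward 1 and its off-diagonal toward 0.
    c1 = curvature_scores(Z1, k)
    c2 = curvature_scores(Z2, k)
    c1 = (c1 - c1.mean()) / (c1.std() + 1e-8)
    c2 = (c2 - c2.mean()) / (c2.std() + 1e-8)
    M = np.outer(c1, c2)                               # curvature-derived matrix
    on_diag = ((np.diag(M) - 1.0) ** 2).sum()
    off_diag = (M ** 2).sum() - (np.diag(M) ** 2).sum()
    return on_diag + lam * off_diag
```

In the full method this term would be added to a Barlow Twins-style redundancy-reduction loss on the projected features; the kernel variant would replace the explicit neighbor directions with a normalized local Gram matrix in an RKHS.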
Related papers
- Graph-based Clustering Revisited: A Relaxation of Kernel $k$-Means Perspective [73.18641268511318]
We propose a graph-based clustering algorithm that relaxes only the orthonormal constraint to derive clustering results. To incorporate the doubly stochastic constraint into a gradient-based update, we transform the non-negative constraint into a class probability parameter.
arXiv Detail & Related papers (2025-09-23T09:14:39Z) - Curvature Learning for Generalization of Hyperbolic Neural Networks [51.888534247573894]
Hyperbolic neural networks (HNNs) have demonstrated notable efficacy in representing real-world data with hierarchical structures. Inappropriate curvatures may cause HNNs to converge to suboptimal parameters, degrading overall performance. We propose a sharpness-aware curvature learning method to smooth the loss landscape, thereby improving the generalization of HNNs.
arXiv Detail & Related papers (2025-08-24T07:14:30Z) - Calibrating Biased Distribution in VFM-derived Latent Space via Cross-Domain Geometric Consistency [52.52950138164424]
We show that when leveraging off-the-shelf (vision) foundation models for feature extraction, the geometric shapes of the resulting feature distributions exhibit remarkable transferability across domains and datasets. We embody our geometric knowledge-guided distribution calibration framework in two popular and challenging settings: federated learning and long-tailed recognition. In long-tailed learning, it utilizes the geometric knowledge transferred from sample-rich categories to recover the true distribution for sample-scarce tail classes.
arXiv Detail & Related papers (2025-08-19T05:22:59Z) - Aggregation on Learnable Manifolds for Asynchronous Federated Optimization [3.8208848658169763]
We introduce a geometric framework that casts aggregation as curve learning. Within this, we propose AsyncBezier, which replaces linear aggregation with low-degree curvature components. We show that these gains are preserved even when other methods are allocated a higher local compute budget.
arXiv Detail & Related papers (2025-03-18T16:36:59Z) - CurvGAD: Leveraging Curvature for Enhanced Graph Anomaly Detection [23.643189106137008]
We propose CurvGAD - a mixed-curvature graph autoencoder that introduces the notion of curvature-based geometric anomalies. CurvGAD introduces two parallel pipelines for enhanced anomaly interpretability. Experiments over 10 real-world datasets demonstrate an improvement of up to 6.5% over state-of-the-art GAD methods.
arXiv Detail & Related papers (2025-02-12T17:49:46Z) - Leveraging CORAL-Correlation Consistency Network for Semi-Supervised Left Atrium MRI Segmentation [14.296441810235223]
Semi-supervised learning (SSL) has been widely used to learn from both a few labeled images and many unlabeled images.
Most current SSL-based segmentation methods use pixel values directly to identify similar features in labeled and unlabeled data.
We introduce the CORAL (Correlation-Aligned) Correlation Consistency Network (CORN) to capture the global structural shape and local details of the left atrium.
arXiv Detail & Related papers (2024-10-21T11:46:28Z) - Stable Nonconvex-Nonconcave Training via Linear Interpolation [51.668052890249726]
This paper presents a theoretical analysis of linear interpolation as a principled method for stabilizing (large-scale) neural network training.
We argue that instabilities in the optimization process are often caused by the nonmonotonicity of the loss landscape and show how linear interpolation can help by leveraging the theory of nonexpansive operators.
arXiv Detail & Related papers (2023-10-20T12:45:12Z) - Last-Iterate Convergence of Adaptive Riemannian Gradient Descent for Equilibrium Computation [52.73824786627612]
This paper establishes new convergence results for geodesically strongly monotone games. Our key result shows that RGD attains last-iterate linear convergence in a geometry-agnostic fashion. Overall, this paper presents the first geometry-agnostic last-iterate convergence analysis for games beyond Euclidean settings.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Machine learning algorithms for three-dimensional mean-curvature computation in the level-set method [0.0]
We propose a data-driven mean-curvature solver for the level-set method.
Our proposed system can yield more accurate mean-curvature estimations than modern particle-based interface reconstruction.
arXiv Detail & Related papers (2022-08-18T20:19:22Z) - Rethinking the Zigzag Flattening for Image Reading [48.976491898131265]
We investigate the Hilbert fractal flattening (HF) as another method for sequence ordering in computer vision.
The HF has proven to be superior to other curves in maintaining spatial locality.
It can be easily plugged into most deep neural networks (DNNs).
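The Hilbert-curve ordering that HF relies on can be generated with the standard index-to-coordinate construction. This is a generic sketch of that well-known algorithm, not code from the cited paper; the function name `d2xy` and the 4x4 grid are illustrative choices.

```python
def d2xy(n, d):
    # Map index d along a Hilbert curve to (x, y) on an n x n grid,
    # where n is a power of two. Standard bit-twiddling construction,
    # shown only to illustrate the locality-preserving ordering.
    x = y = 0
    s, t = 1, d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                       # rotate the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx                       # move into the quadrant
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Flatten a 4x4 image along the Hilbert curve instead of zigzag order:
order = [d2xy(4, d) for d in range(16)]
```

Every pair of consecutive indices maps to spatially adjacent pixels, which is the locality property that makes the Hilbert curve preferable to zigzag flattening.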
arXiv Detail & Related papers (2022-02-21T13:53:04Z) - Error-Correcting Neural Networks for Two-Dimensional Curvature Computation in the Level-Set Method [0.0]
We present an error-neural-modeling-based strategy for approximating two-dimensional curvature in the level-set method.
Our main contribution is a redesigned hybrid solver that relies on numerical schemes to enable machine-learning operations on demand.
arXiv Detail & Related papers (2022-01-22T05:14:40Z) - Cogradient Descent for Dependable Learning [64.02052988844301]
We propose a dependable learning method based on the Cogradient Descent (CoGD) algorithm to address the bilinear optimization problem.
CoGD is introduced to solve bilinear problems when one variable has a sparsity constraint.
It can also be used to decompose the association of features and weights, which further generalizes our method to better train convolutional neural networks (CNNs).
arXiv Detail & Related papers (2021-06-20T04:28:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.