Riemannian Patch Assignment Gradient Flows
- URL: http://arxiv.org/abs/2504.13024v2
- Date: Tue, 22 Apr 2025 19:54:20 GMT
- Title: Riemannian Patch Assignment Gradient Flows
- Authors: Daniel Gonzalez-Alvarado, Fabio Schlindwein, Jonas Cassel, Laura Steingruber, Stefania Petra, Christoph Schnörr
- Abstract summary: This paper introduces patch assignment flows for metric data labeling on graphs. Labelings are determined by regularizing initial local labelings through the dynamic interaction of both labels and label assignments.
- Score: 1.715270928578365
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces patch assignment flows for metric data labeling on graphs. Labelings are determined by regularizing initial local labelings through the dynamic interaction of both labels and label assignments across the graph, entirely encoded by a dictionary of competing labeled patches and mediated by patch assignment variables. Maximal consistency of patch assignments is achieved by geometric numerical integration of a Riemannian ascent flow, which arises as a critical point of a Lagrangian action functional. Experiments illustrate properties of the approach, including uncertainty quantification of label assignments.
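The core mechanism the abstract describes, assignment vectors on the probability simplex evolving under a graph-regularized Riemannian flow and integrated geometrically, can be illustrated with a minimal sketch. Note the graph weights `W`, data scores `D`, step size `h`, and the multiplicative update below are illustrative assumptions for a basic assignment flow, not the paper's actual patch-based formulation.

```python
import numpy as np

def softmax_rows(X):
    """Row-wise softmax; maps logits back onto the probability simplex."""
    e = np.exp(X - X.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def flow_step(W, A, D, h=0.1):
    """One explicit geometric-Euler step of a simplified assignment flow.

    W : (n, n) row-stochastic graph weights (spatial regularization)
    A : (n, c) assignment vectors, one simplex point per node
    D : (n, c) data-fidelity scores (higher = label fits node better)
    """
    logits = np.log(np.clip(A, 1e-30, 1.0))
    S = W @ (D + logits)                 # context-averaged similarity
    return softmax_rows(logits + h * S)  # multiplicative update stays on the simplex

# toy chain graph with 4 nodes and 2 labels
W = np.array([[0.50, 0.50, 0.00, 0.00],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.00, 0.00, 0.50, 0.50]])
D = np.array([[2.0, 0.0], [1.5, 0.5], [0.5, 1.5], [0.0, 2.0]])
A = np.full((4, 2), 0.5)                 # start at the barycenter (uniform)
for _ in range(50):
    A = flow_step(W, A, D)
# A concentrates: left nodes toward label 0, right nodes toward label 1
```

The multiplicative (exponential-map) update is what keeps each row on the simplex without projection, which is the point of integrating the flow geometrically rather than with a plain Euler step in ambient coordinates.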
Related papers
- Optimized 3D Point Labeling with Leaders Using the Beams Displacement Method [14.377997862577182]
Leadered labels have a large degree of freedom in position configuration. We conceptualize the dynamic configuration process of computing label positions as akin to solving a map displacement problem.
arXiv Detail & Related papers (2024-06-28T01:31:37Z)
- Posterior Label Smoothing for Node Classification [2.737276507021477]
We propose a simple yet effective label smoothing for the transductive node classification task.
We design the soft label to encapsulate the local context of the target node through the neighborhood label distribution.
In the following analysis, we find that incorporating global label statistics in posterior computation is the key to the success of label smoothing.
arXiv Detail & Related papers (2024-06-01T11:59:49Z)
- Inaccurate Label Distribution Learning with Dependency Noise [52.08553913094809]
We introduce the Dependent Noise-based Inaccurate Label Distribution Learning (DN-ILDL) framework to tackle the challenges posed by noise in label distribution learning.
We show that DN-ILDL effectively addresses the ILDL problem and outperforms existing LDL methods.
arXiv Detail & Related papers (2024-05-26T07:58:07Z)
- Influence Functions for Sequence Tagging Models [49.81774968547377]
We extend influence functions to trace predictions back to the training points that informed them.
We show the practical utility of segment influence by using the method to identify systematic annotation errors.
arXiv Detail & Related papers (2022-10-25T17:13:11Z)
- Label Distribution Learning via Implicit Distribution Representation [12.402054374952485]
In this paper, we introduce the implicit distribution in the label distribution learning framework to characterize the uncertainty of each label value.
Specifically, we use deep implicit representation learning to construct a label distribution matrix with Gaussian prior constraints.
Each row component of the label distribution matrix is transformed into a standard label distribution form by using the self-attention algorithm.
arXiv Detail & Related papers (2022-09-28T04:13:53Z)
- Multi-label Classification with High-rank and High-order Label Correlations [62.39748565407201]
Previous methods capture the high-order label correlations mainly by transforming the label matrix to a latent label space with low-rank matrix factorization.
We propose a simple yet effective method to depict the high-order label correlations explicitly, and at the same time maintain the high-rank of the label matrix.
Comparative studies over twelve benchmark data sets validate the effectiveness of the proposed algorithm in multi-label classification.
arXiv Detail & Related papers (2022-07-09T05:15:31Z)
- Graph Attention Transformer Network for Multi-Label Image Classification [50.0297353509294]
We propose a general framework for multi-label image classification that can effectively mine complex inter-label relationships.
Our proposed methods can achieve state-of-the-art performance on three datasets.
arXiv Detail & Related papers (2022-03-08T12:39:05Z)
- Why Propagate Alone? Parallel Use of Labels and Features on Graphs [42.01561812621306]
Graph neural networks (GNNs) and label propagation represent two interrelated modeling strategies designed to exploit graph structure in tasks such as node property prediction.
We show that a label trick can be reduced to an interpretable, deterministic training objective composed of two factors.
arXiv Detail & Related papers (2021-10-14T07:34:11Z)
- GRACE: Gradient Harmonized and Cascaded Labeling for Aspect-based Sentiment Analysis [90.43089622630258]
We propose a GRadient hArmonized and CascadEd labeling model (GRACE) to solve these problems.
The proposed model achieves consistency improvement on multiple benchmark datasets and generates state-of-the-art results.
arXiv Detail & Related papers (2020-09-22T13:55:34Z)
- Assignment Flows for Data Labeling on Graphs: Convergence and Stability [69.68068088508505]
This paper establishes conditions on the weight parameters that guarantee convergence of the continuous-time assignment flow to integral assignments (labelings).
Several counter-examples illustrate that violating the conditions may entail unfavorable behavior of the assignment flow regarding contextual data classification.
arXiv Detail & Related papers (2020-02-26T15:45:38Z)
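The notion of convergence to integral assignments in the entry above admits a simple numerical diagnostic. One illustrative choice (an assumption here, not the cited paper's criterion) is row-wise Shannon entropy of the assignment matrix, which is zero exactly at one-hot labelings and also serves as a per-node uncertainty measure of the kind the main abstract mentions:

```python
import numpy as np

def is_integral(A, tol=0.05):
    """Return True if every assignment row is within `tol` (in Shannon
    entropy) of a one-hot labeling, i.e. the flow has reached an
    (almost) integral assignment."""
    Ac = np.clip(A, 1e-12, 1.0)          # avoid log(0) on exact one-hot rows
    H = -(Ac * np.log(Ac)).sum(axis=1)   # row-wise entropy, 0 for one-hot rows
    return bool((H < tol).all())

A_converged = np.array([[0.999, 0.001], [0.002, 0.998]])
A_uncertain = np.array([[0.600, 0.400], [0.500, 0.500]])
# is_integral(A_converged) -> True; is_integral(A_uncertain) -> False
```

Rows that fail the check flag exactly the nodes where the labeling remains ambiguous, so the same quantity doubles as an uncertainty map over the graph.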
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.