A Multilayer Framework for Online Metric Learning
- URL: http://arxiv.org/abs/1805.05510v3
- Date: Thu, 24 Aug 2023 05:37:44 GMT
- Title: A Multilayer Framework for Online Metric Learning
- Authors: Wenbin Li, Yanfang Liu, Jing Huo, Yinghuan Shi, Yang Gao, Lei Wang and
Jiebo Luo
- Abstract summary: This paper proposes a multilayer framework for online metric learning to capture the nonlinear similarities among instances.
A new Mahalanobis-based Online Metric Learning (MOML) algorithm, built on a passive-aggressive strategy and a one-pass triplet construction strategy, is presented to serve as each metric layer.
The proposed MLOML enjoys several desirable properties, learns a metric progressively, and performs better on benchmark datasets.
- Score: 71.31889711244739
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Online metric learning has been widely applied in classification and
retrieval. It can automatically learn a suitable metric from data by
restricting similar instances to be separated from dissimilar instances with a
given margin. However, the existing online metric learning algorithms have
limited performance in real-world classification tasks, especially when data
distributions are complex. To this end, this paper proposes a multilayer
framework for online metric learning to capture the nonlinear similarities
among instances. Different from the traditional online metric learning, which
can only learn one metric space, the proposed Multi-Layer Online Metric
Learning (MLOML) takes an online metric learning algorithm as a metric layer
and learns multiple hierarchical metric spaces, where each metric layer is
followed by a nonlinear layer to handle complicated data distributions.
Moreover, the forward
propagation (FP) strategy and backward propagation (BP) strategy are employed
to train the hierarchical metric layers. To build a metric layer of the
proposed MLOML, a new Mahalanobis-based Online Metric Learning (MOML) algorithm
is presented based on the passive-aggressive strategy and one-pass triplet
construction strategy. Furthermore, by learning progressively and nonlinearly,
MLOML has a stronger learning ability than traditional online metric learning
when the available training data are limited. To make the learning process more
explainable and theoretically guaranteed, a theoretical analysis is provided.
The proposed MLOML enjoys several desirable properties, learns a metric
progressively, and performs better on benchmark datasets. Extensive
experiments with different settings have been conducted to verify these
properties of the proposed MLOML.
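As a rough illustration of the pieces described in the abstract, the sketch below stacks Mahalanobis-style metric layers with a nonlinear activation and applies a passive-aggressive update to one layer on a margin-violating triplet. It is a minimal sketch under assumed choices (the linear-map parameterization, the tanh nonlinearity, the PA-I style step size, and all function names are my assumptions), not the authors' MOML/MLOML implementation.

```python
import numpy as np

def moml_pa_update(L, anchor, positive, negative, margin=1.0, C=1.0):
    """One passive-aggressive update of a Mahalanobis-style metric layer.

    The layer is parameterized by a linear map L, inducing the distance
    d(x, y) = ||L x - L y||^2. The step is 'passive' when the triplet already
    satisfies the margin and 'aggressive' (a bounded gradient step) otherwise.
    This is a sketch, not the exact MOML update rule.
    """
    za, zp, zn = L @ anchor, L @ positive, L @ negative
    d_pos = float(np.sum((za - zp) ** 2))
    d_neg = float(np.sum((za - zn) ** 2))
    loss = max(0.0, margin + d_pos - d_neg)          # hinge loss on the triplet
    if loss == 0.0:
        return L                                     # passive: constraint satisfied
    # Gradient of the hinge loss with respect to L
    grad = 2.0 * (np.outer(za - zp, anchor - positive)
                  - np.outer(za - zn, anchor - negative))
    tau = min(C, loss / (np.sum(grad ** 2) + 1e-12))  # PA-I style step size
    return L - tau * grad


def forward(layers, x):
    """Forward propagation through stacked (metric layer, nonlinearity) pairs."""
    for L in layers:
        x = np.tanh(L @ x)       # nonlinear layer after each metric layer (assumed tanh)
    return x


# Toy usage: two stacked 4->4 metric layers, updated on a single triplet.
rng = np.random.default_rng(0)
layers = [np.eye(4) for _ in range(2)]
a, p, n = rng.normal(size=4), rng.normal(size=4), rng.normal(size=4)
# Here only the top layer is updated on features from the layer below;
# the paper's BP strategy would also propagate the error to lower layers.
h_a, h_p, h_n = (np.tanh(layers[0] @ v) for v in (a, p, n))
layers[1] = moml_pa_update(layers[1], h_a, h_p, h_n)
```

In the full framework, forward propagation pushes each instance through all stacked layers and backward propagation routes the triplet loss to the lower layers as well; the one-pass triplet construction presumably builds each triplet from the incoming instance and previously seen similar and dissimilar instances so that the stream is visited only once.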
Related papers
- ConML: A Universal Meta-Learning Framework with Task-Level Contrastive Learning [49.447777286862994]
ConML is a universal meta-learning framework that can be applied to various meta-learning algorithms.
We demonstrate that ConML integrates seamlessly with optimization-based, metric-based, and amortization-based meta-learning algorithms.
arXiv Detail & Related papers (2024-10-08T12:22:10Z) - Performance Law of Large Language Models [58.32539851241063]
Performance law can be used to guide the choice of LLM architecture and the effective allocation of computational resources without extensive experiments.
arXiv Detail & Related papers (2024-08-19T11:09:12Z) - Few-shot Metric Learning: Online Adaptation of Embedding for Retrieval [37.601607544184915]
Metric learning aims to build a distance metric typically by learning an effective embedding function that maps similar objects into nearby points.
Despite recent advances in deep metric learning, it remains challenging for the learned metric to generalize to unseen classes with a substantial domain gap.
We propose a new problem of few-shot metric learning that aims to adapt the embedding function to the target domain with only a few annotated data.
arXiv Detail & Related papers (2022-11-14T05:10:17Z) - Classification Performance Metric Elicitation and its Applications [5.5637552942511155]
Despite its practical interest, there is limited formal guidance on how to select metrics for machine learning applications.
This thesis outlines metric elicitation as a principled framework for selecting the performance metric that best reflects implicit user preferences.
arXiv Detail & Related papers (2022-08-19T03:57:17Z) - Adaptive neighborhood Metric learning [184.95321334661898]
We propose a novel distance metric learning algorithm, named adaptive neighborhood metric learning (ANML).
ANML can be used to learn both the linear and deep embeddings.
The log-exp mean function proposed in our method gives a new perspective for reviewing deep metric learning methods (a generic form of this aggregation is sketched below).
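The log-exp mean is not defined in this summary; as a hedged illustration of the general shape such an aggregation typically takes (an assumption on my part, not necessarily the exact ANML objective), a log-exp mean over per-neighbor distances interpolates between their average and their maximum depending on a sharpness parameter:

```latex
% Generic log-exp mean of per-neighbor distances d_1, ..., d_n with
% sharpness parameter beta > 0 (illustrative form, not the exact ANML loss):
\mathrm{LEM}_{\beta}(d_1,\dots,d_n)
  = \frac{1}{\beta}\,\log\!\left(\frac{1}{n}\sum_{i=1}^{n} e^{\beta d_i}\right),
\qquad
\lim_{\beta \to 0^{+}} \mathrm{LEM}_{\beta} = \frac{1}{n}\sum_{i=1}^{n} d_i,
\qquad
\lim_{\beta \to \infty} \mathrm{LEM}_{\beta} = \max_{i} d_i .
```

Under this reading, a small sharpness parameter recovers an average-neighbor style loss while a large one approaches a hardest-neighbor loss, which is one way such a function can serve as a common lens on deep metric learning objectives.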
arXiv Detail & Related papers (2022-01-20T17:26:37Z) - A Framework to Enhance Generalization of Deep Metric Learning methods
using General Discriminative Feature Learning and Class Adversarial Neural
Networks [1.5469452301122175]
Metric learning algorithms aim to learn a distance function that brings semantically similar data items together and keeps dissimilar ones at a distance.
Deep Metric Learning (DML) methods have been proposed that automatically extract features from data and learn a non-linear transformation from the input space to a semantic embedding space.
We propose a framework to enhance the generalization power of existing DML methods in a Zero-Shot Learning (ZSL) setting.
arXiv Detail & Related papers (2021-06-11T14:24:40Z) - Memory-Based Optimization Methods for Model-Agnostic Meta-Learning and
Personalized Federated Learning [56.17603785248675]
Model-agnostic meta-learning (MAML) has become a popular research area.
Existing MAML algorithms rely on the 'episode' idea by sampling a few tasks and data points to update the meta-model at each iteration.
This paper proposes memory-based algorithms for MAML that converge with vanishing error.
arXiv Detail & Related papers (2021-06-09T08:47:58Z) - Towards Self-Adaptive Metric Learning On the Fly [16.61982837441342]
We aim to address the open challenge of "Online Adaptive Metric Learning" (OAML) for learning adaptive metric functions on the fly.
Unlike traditional online metric learning, OAML is significantly more challenging since the learned metric could be non-linear and the model has to be self-adaptive.
We present a new online metric learning framework that attempts to tackle the challenge by learning an ANN-based metric with adaptive model complexity from a stream of constraints.
arXiv Detail & Related papers (2021-04-03T23:11:52Z) - ECML: An Ensemble Cascade Metric Learning Mechanism towards Face
Verification [50.137924223702264]
In particular, hierarchical metric learning is executed in a cascade manner to alleviate underfitting.
Considering the feature distribution characteristics of faces, a robust Mahalanobis metric learning method (RMML) with a closed-form solution is additionally proposed.
EC-RMML is superior to state-of-the-art metric learning methods for face verification.
arXiv Detail & Related papers (2020-07-11T08:47:07Z) - Online Metric Learning for Multi-Label Classification [22.484707213499714]
We propose a novel online metric learning paradigm for multi-label classification.
We first propose a new metric for multi-label classification based on $k$-Nearest Neighbour ($k$NN).
arXiv Detail & Related papers (2020-06-12T11:33:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.