MLAS: Metric Learning on Attributed Sequences
- URL: http://arxiv.org/abs/2011.04062v1
- Date: Sun, 8 Nov 2020 19:35:42 GMT
- Title: MLAS: Metric Learning on Attributed Sequences
- Authors: Zhongfang Zhuang, Xiangnan Kong, Elke Rundensteiner, Jihane Zouaoui,
Aditya Arora
- Abstract summary: Conventional approaches to metric learning mainly focus on learning the Mahalanobis distance metric on data attributes.
We propose a deep learning framework, called MLAS, to learn a distance metric that effectively measures dissimilarities between attributed sequences.
- Score: 13.689383530299502
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Distance metric learning has attracted much attention in recent years, where
the goal is to learn a distance metric based on user feedback. Conventional
approaches to metric learning mainly focus on learning the Mahalanobis distance
metric on data attributes. Recent research on metric learning has been extended
to sequential data, where we only have structural information in the sequences,
but no attribute is available. However, real-world applications often involve
attributed sequence data (e.g., clickstreams), where each instance consists of
not only a set of attributes (e.g., user session context) but also a sequence
of categorical items (e.g., user actions). In this paper, we study the problem
of metric learning on attributed sequences. Unlike previous work on metric
learning, we now need to go beyond the Mahalanobis distance metric in the
attribute feature space while also incorporating the structural information in
sequences. We propose a deep learning framework, called MLAS (Metric Learning
on Attributed Sequences), to learn a distance metric that effectively measures
dissimilarities between attributed sequences. Empirical results on real-world
datasets demonstrate that the proposed MLAS framework significantly improves
the performance of metric learning compared to state-of-the-art methods on
attributed sequences.
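The abstract leaves out architectural details, but the core idea, embedding each attributed sequence (its attribute vector plus its sequence of categorical items) into a shared space and learning distances from user-feedback pairs, can be sketched in code. The snippet below is a minimal, hypothetical PyTorch sketch: the encoder sizes, the LSTM sequence encoder, and the contrastive loss are assumptions made for illustration, not the authors' exact MLAS design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttributedSequenceEncoder(nn.Module):
    """Embeds (attribute vector, categorical item sequence) pairs into one metric space.

    Illustrative sketch only; not the exact MLAS architecture.
    """
    def __init__(self, num_attrs, vocab_size, item_dim=32, hidden_dim=64, embed_dim=32):
        super().__init__()
        self.attr_net = nn.Sequential(              # encodes the attribute part
            nn.Linear(num_attrs, hidden_dim), nn.ReLU()
        )
        self.item_emb = nn.Embedding(vocab_size, item_dim, padding_idx=0)
        self.seq_net = nn.LSTM(item_dim, hidden_dim, batch_first=True)   # encodes the sequence part
        self.head = nn.Linear(2 * hidden_dim, embed_dim)                 # fuses both views

    def forward(self, attrs, seqs):
        a = self.attr_net(attrs)                     # (B, hidden_dim)
        _, (h, _) = self.seq_net(self.item_emb(seqs))
        s = h[-1]                                    # (B, hidden_dim), last LSTM hidden state
        return self.head(torch.cat([a, s], dim=1))   # (B, embed_dim)

def contrastive_loss(z1, z2, label, margin=1.0):
    """label = 1 for 'similar' feedback pairs, 0 for 'dissimilar' ones."""
    d = F.pairwise_distance(z1, z2)
    return (label * d.pow(2) + (1 - label) * F.relu(margin - d).pow(2)).mean()

# Toy usage: two attributed sequences judged "similar" by user feedback.
enc = AttributedSequenceEncoder(num_attrs=5, vocab_size=100)
attrs = torch.rand(2, 5)                             # session-context attributes
seqs = torch.randint(1, 100, (2, 7))                 # sequences of categorical action ids
z = enc(attrs, seqs)
loss = contrastive_loss(z[0:1], z[1:2], label=torch.tensor([1.0]))
loss.backward()
```

In spirit, pairs the user marks as similar are pulled together in the embedding space and dissimilar pairs are pushed at least a margin apart; this learned, nonlinear embedding is what replaces the fixed Mahalanobis form used by attribute-only metric learning.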
Related papers
- AttriCLIP: A Non-Incremental Learner for Incremental Knowledge Learning [53.32576252950481]
Continual learning aims to enable a model to incrementally learn knowledge from sequentially arrived data.
In this paper, we propose a non-incremental learner, named AttriCLIP, to incrementally extract knowledge of new classes or tasks.
arXiv Detail & Related papers (2023-05-19T07:39:17Z)
- Exogenous Data in Forecasting: FARM -- A New Measure for Relevance Evaluation [62.997667081978825]
We introduce a new approach named FARM - Forward Relevance Aligned Metric.
Our forward method relies on an angular measure that compares changes in subsequent data points to align time-warped series.
As a first validation step, we present the application of our FARM approach to synthetic but representative signals.
arXiv Detail & Related papers (2023-04-21T15:22:33Z)
- Metric Learning Improves the Ability of Combinatorial Coverage Metrics to Anticipate Classification Error [0.0]
Many machine learning methods are sensitive to test or operational data that is dissimilar to training data.
Metric learning is a technique for learning latent spaces in which data from different classes lie further apart.
In a study of 6 open-source datasets, we find that metric learning increased the difference between set-difference coverage metrics calculated on correctly and incorrectly classified data.
arXiv Detail & Related papers (2023-02-28T14:55:57Z)
- Few-shot Metric Learning: Online Adaptation of Embedding for Retrieval [37.601607544184915]
Metric learning aims to build a distance metric typically by learning an effective embedding function that maps similar objects into nearby points.
Despite recent advances in deep metric learning, it remains challenging for the learned metric to generalize to unseen classes with a substantial domain gap.
We propose a new problem of few-shot metric learning that aims to adapt the embedding function to the target domain with only a few annotated data.
arXiv Detail & Related papers (2022-11-14T05:10:17Z)
- Self-Taught Metric Learning without Labels [47.832107446521626]
We present a novel self-taught framework for unsupervised metric learning.
It alternates between predicting class-equivalence relations between data through a moving average of an embedding model and learning the model with the predicted relations as pseudo labels.
arXiv Detail & Related papers (2022-05-04T05:48:40Z)
- Deep Relational Metric Learning [84.95793654872399]
This paper presents a deep relational metric learning framework for image clustering and retrieval.
We learn an ensemble of features that characterizes an image from different aspects to model both interclass and intraclass distributions.
Experiments on the widely-used CUB-200-2011, Cars196, and Stanford Online Products datasets demonstrate that our framework improves existing deep metric learning methods and achieves very competitive results.
arXiv Detail & Related papers (2021-08-23T09:31:18Z)
- Finding Significant Features for Few-Shot Learning using Dimensionality Reduction [0.0]
This module improves accuracy by providing the similarity function, given by the metric learning method, with more discriminative features for classification.
Our method outperforms the metric learning baselines on the miniImageNet dataset by around 2% in accuracy.
arXiv Detail & Related papers (2021-07-06T16:36:57Z)
- ECML: An Ensemble Cascade Metric Learning Mechanism towards Face Verification [50.137924223702264]
In particular, hierarchical metric learning is executed in a cascaded way to alleviate underfitting.
Considering the feature distribution characteristics of faces, a robust Mahalanobis metric learning method (RMML) with a closed-form solution is additionally proposed (a generic sketch of the Mahalanobis distance itself appears after this list).
EC-RMML is superior to state-of-the-art metric learning methods for face verification.
arXiv Detail & Related papers (2020-07-11T08:47:07Z)
- Metric Learning for Ordered Labeled Trees with pq-grams [11.284638114256712]
We propose a new metric learning approach for tree-structured data with pq-grams.
The pq-gram distance is a distance for ordered labeled trees and has a much lower computational cost than the tree edit distance.
We empirically show that the proposed approach achieves competitive results with the state-of-the-art edit distance-based methods.
arXiv Detail & Related papers (2020-03-09T08:04:47Z)
- Supervised Categorical Metric Learning with Schatten p-Norms [10.995886294197412]
We propose a method, called CPML (Categorical Projected Metric Learning), to address the problem of metric learning on categorical data.
We make use of the Value Distance Metric to represent our data and propose new distances based on this representation.
We then show how to efficiently learn new metrics.
arXiv Detail & Related papers (2020-02-26T01:17:12Z)
- A Multilayer Framework for Online Metric Learning [71.31889711244739]
This paper proposes a multilayer framework for online metric learning to capture the nonlinear similarities among instances.
A new Mahalanobis-based Online Metric Learning (MOML) algorithm is presented, based on the passive-aggressive strategy and a one-pass triplet construction strategy.
The proposed multilayer framework, MLOML, enjoys several nice properties, learns a metric progressively, and performs better on the benchmark datasets.
arXiv Detail & Related papers (2018-05-15T01:10:18Z)
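Several of the entries above (e.g., RMML and MOML), as well as the MLAS abstract itself, build on the Mahalanobis distance metric over attribute vectors. The metric is commonly parameterized by a positive semi-definite matrix M = L^T L, giving d_M(x, y) = sqrt((x - y)^T M (x - y)). The NumPy sketch below is only a generic illustration of that parameterization; the matrix L here is a random placeholder, whereas the methods above learn it, e.g., in closed form or online from triplets.

```python
import numpy as np

def mahalanobis(x, y, L):
    """d_M(x, y) = sqrt((x - y)^T M (x - y)) with M = L^T L (positive semi-definite)."""
    diff = L @ (x - y)            # equivalent to measuring Euclidean distance in the projected space
    return float(np.sqrt(diff @ diff))

rng = np.random.default_rng(0)
d = 4                             # attribute dimension
L = rng.normal(size=(d, d))       # placeholder only: metric learning methods *learn* L (or M)
x, y = rng.normal(size=d), rng.normal(size=d)

print(mahalanobis(x, y, L))            # distance under the (here random) learned metric
print(mahalanobis(x, y, np.eye(d)))    # with L = I this reduces to the Euclidean distance
```

Setting L to the identity recovers the plain Euclidean distance, which is the baseline these methods improve on by fitting L (or M) to the available supervision.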
The list of related papers above is automatically generated from the titles and abstracts of the papers on this site.