A Survey on Concept Factorization: From Shallow to Deep Representation
Learning
- URL: http://arxiv.org/abs/2007.15840v3
- Date: Sun, 31 Jan 2021 08:45:58 GMT
- Title: A Survey on Concept Factorization: From Shallow to Deep Representation
Learning
- Authors: Zhao Zhang, Yan Zhang, Mingliang Xu, Li Zhang, Yi Yang, Shuicheng Yan
- Abstract summary: Concept Factorization (CF) has attracted a great deal of interest in the areas of machine learning and data mining.
We first review the root CF method, and then explore the advancement of CF-based representation learning.
We also introduce the potential application areas of CF-based methods.
- Score: 104.78577405792592
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The quality of features learned by representation learning determines the
performance of learning algorithms and related application tasks (such as
high-dimensional data clustering). As a relatively new paradigm for
representation learning, Concept Factorization (CF) has attracted a great deal
of interest in the areas of machine learning and data mining for over a
decade. Many effective CF-based methods have been proposed from different
perspectives and with different properties, yet it remains difficult to grasp
their essential connections and identify the underlying explanatory
factors from existing studies. In this paper, we therefore survey the recent
advances in CF methodologies and the potential benchmarks by categorizing and
summarizing the current methods. Specifically, we first review the root CF
method, and then explore the advancement of CF-based representation learning,
ranging from shallow to deep/multilayer cases. We also introduce the potential
application areas of CF-based methods. Finally, we point out some future
directions for studying CF-based representation learning. Overall, this
survey provides an insightful overview of both the theoretical basis and current
developments in the field of CF, which can help interested researchers
understand current trends in CF and find the most appropriate CF
techniques for particular applications.
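The root CF method the abstract refers to (Xu and Gong's Concept Factorization) approximates a data matrix X (features x samples) as X ~ X W V^T, with nonnegative W and V, so that each concept is a nonnegative combination of the data points themselves and rows of V give the new sample representations. A minimal NumPy sketch of the standard multiplicative updates is below; the function name, hyperparameters, and toy data are illustrative, not from the survey:

```python
import numpy as np

def concept_factorization(X, k, n_iter=200, eps=1e-9, seed=0):
    """Basic Concept Factorization via multiplicative updates.

    Approximates X (d x n) as X @ W @ V.T with nonnegative W, V (both n x k).
    Columns of X @ W act as learned "concepts"; rows of V are the
    low-dimensional representations of the samples.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = rng.random((n, k))
    V = rng.random((n, k))
    K = X.T @ X  # Gram matrix: updates need only inner products of samples
    for _ in range(n_iter):
        # Multiplicative updates keep W and V nonnegative throughout
        W *= (K @ V) / (K @ W @ (V.T @ V) + eps)
        V *= (K @ W) / (V @ (W.T @ K @ W) + eps)
    return W, V

# Toy usage on random nonnegative data
X = np.abs(np.random.default_rng(1).normal(size=(20, 50)))
W, V = concept_factorization(X, k=5)
err = np.linalg.norm(X - X @ W @ V.T) / np.linalg.norm(X)
```

Because the updates depend on X only through the Gram matrix K = X^T X, the same scheme kernelizes directly, which is one reason CF is preferred over plain NMF for data that is not elementwise nonnegative.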
Related papers
- High-Performance Few-Shot Segmentation with Foundation Models: An Empirical Study [64.06777376676513]
We develop a few-shot segmentation (FSS) framework based on foundation models.
To be specific, we propose a simple approach to extract implicit knowledge from foundation models to construct coarse correspondence.
Experiments on two widely used datasets demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2024-09-10T08:04:11Z)
- Coding for Intelligence from the Perspective of Category [66.14012258680992]
Coding targets compressing and reconstructing data.
Recent trends demonstrate the potential homogeneity of coding and intelligence.
We propose a novel problem of Coding for Intelligence from the category theory view.
arXiv Detail & Related papers (2024-07-01T07:05:44Z)
- Continual Learning with Pre-Trained Models: A Survey [61.97613090666247]
Continual Learning (CL) aims to overcome the catastrophic forgetting of former knowledge when learning new tasks.
This paper presents a comprehensive survey of the latest advancements in pre-trained model (PTM)-based CL.
arXiv Detail & Related papers (2024-01-29T18:27:52Z)
- Few-shot Class-incremental Learning: A Survey [16.729567512584822]
Few-shot Class-Incremental Learning (FSCIL) presents a unique challenge in Machine Learning (ML)
This paper aims to provide a comprehensive and systematic review of FSCIL.
arXiv Detail & Related papers (2023-08-13T13:01:21Z)
- Semi-supervised multi-view concept decomposition [30.699496411869834]
Concept Factorization (CF) has demonstrated superior performance in multi-view clustering tasks.
We propose a novel semi-supervised multi-view concept factorization model, named SMVCF.
We conduct experiments on four diverse datasets to evaluate the performance of SMVCF.
arXiv Detail & Related papers (2023-07-03T10:50:44Z)
- A Survey on Few-Shot Class-Incremental Learning [11.68962265057818]
Few-shot class-incremental learning (FSCIL) poses a significant challenge for deep neural networks to learn new tasks.
This paper provides a comprehensive survey on FSCIL.
FSCIL has achieved impressive results in various fields of computer vision.
arXiv Detail & Related papers (2023-04-17T10:15:08Z)
- A Comprehensive Survey on Deep Clustering: Taxonomy, Challenges, and Future Directions [48.97008907275482]
Clustering is a fundamental machine learning task which has been widely studied in the literature.
Deep Clustering, i.e., jointly optimizing the representation learning and clustering, has been proposed and hence attracted growing attention in the community.
We summarize the essential components of deep clustering and categorize existing methods by the ways they design interactions between deep representation learning and clustering.
arXiv Detail & Related papers (2022-06-15T15:05:13Z)
- Which Mutual-Information Representation Learning Objectives are Sufficient for Control? [80.2534918595143]
Mutual information provides an appealing formalism for learning representations of data.
This paper formalizes the sufficiency of a state representation for learning and representing the optimal policy.
Surprisingly, we find that two of these objectives can yield insufficient representations given mild and common assumptions on the structure of the MDP.
arXiv Detail & Related papers (2021-06-14T10:12:34Z)
- A Theoretical Analysis of Catastrophic Forgetting through the NTK Overlap Matrix [16.106653541368306]
We show that the impact of Catastrophic Forgetting increases as two tasks increasingly align.
We propose a variant of Orthogonal Gradient Descent (OGD) which leverages structure of the data.
Experiments support our theoretical findings and show how our method can help reduce CF on classical CL datasets.
arXiv Detail & Related papers (2020-10-07T17:35:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.