Homomorphisms Between Transfer, Multi-Task, and Meta-Learning Systems
- URL: http://arxiv.org/abs/2208.03316v1
- Date: Thu, 4 Aug 2022 18:13:59 GMT
- Title: Homomorphisms Between Transfer, Multi-Task, and Meta-Learning Systems
- Authors: Tyler Cody
- Abstract summary: This manuscript formalizes transfer learning, multi-task learning, and meta-learning as abstract learning systems.
It uses the presented formalism to relate the three concepts of learning in terms of composition, hierarchy, and structural homomorphism.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Transfer learning, multi-task learning, and meta-learning are well-studied
topics concerned with the generalization of knowledge across learning tasks and
are closely related to general intelligence. But, the formal, general systems
differences between them are underexplored in the literature. This lack of
systems-level formalism leads to difficulties in coordinating related,
inter-disciplinary engineering efforts. This manuscript formalizes transfer
learning, multi-task learning, and meta-learning as abstract learning systems,
consistent with the formal-minimalist abstract systems theory of Mesarovic and
Takahara. Moreover, it uses the presented formalism to relate the three
concepts of learning in terms of composition, hierarchy, and structural
homomorphism. Findings are readily depicted in terms of input-output systems,
highlighting the ease of delineating formal, general systems differences
between transfer, multi-task, and meta-learning.
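The paper's framing can be made concrete with the standard Mesarovician definitions. The sketch below is illustrative only and does not reproduce the manuscript's exact notation: the carrier sets D (data) and H (hypotheses) and the relation names L and T are assumptions introduced here.

```latex
% A Mesarovician (input-output) system is a relation on its input and output sets:
S \subseteq X \times Y

% A learning system can then be sketched as a relation from data to hypotheses
% (the carrier sets D and H are illustrative, not the paper's notation):
L \subseteq D \times H

% Transfer learning, following the "relation on sets" framing of the related
% systems-theory paper listed below, relates a source and a target learning system:
T \subseteq L_{\text{source}} \times L_{\text{target}}

% A structural homomorphism between input-output systems S_1 \subseteq X_1 \times Y_1
% and S_2 \subseteq X_2 \times Y_2 is a pair of maps that preserves the relation:
f : X_1 \to X_2, \qquad g : Y_1 \to Y_2, \qquad
(x, y) \in S_1 \;\Longrightarrow\; (f(x), g(y)) \in S_2
```

In this reading, composition and hierarchy amount to building larger input-output systems out of smaller ones, and the homomorphism condition is what allows the transfer, multi-task, and meta-learning constructions to be compared structurally.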
Related papers
- Advances and Challenges in Meta-Learning: A Technical Review [7.149235250835041]
Meta-learning empowers learning systems with the ability to acquire knowledge from multiple tasks.
This review emphasizes its importance in real-world applications where data may be scarce or expensive to obtain.
arXiv Detail & Related papers (2023-07-10T17:32:15Z)
- Learning with Limited Samples -- Meta-Learning and Applications to Communication Systems [46.760568562468606]
Few-shot meta-learning optimizes learning algorithms so that they can adapt quickly and efficiently to new tasks.
This review monograph provides an introduction to meta-learning by covering principles, algorithms, theory, and engineering applications.
arXiv Detail & Related papers (2022-10-03T17:15:36Z)
- Foundations and Recent Trends in Multimodal Machine Learning: Principles, Challenges, and Open Questions [68.6358773622615]
This paper provides an overview of the computational and theoretical foundations of multimodal machine learning.
We propose a taxonomy of 6 core technical challenges: representation, alignment, reasoning, generation, transference, and quantification.
Recent technical achievements will be presented through the lens of this taxonomy, allowing researchers to understand the similarities and differences across new approaches.
arXiv Detail & Related papers (2022-09-07T19:21:19Z)
- Multimodality in Meta-Learning: A Comprehensive Survey [34.69292359136745]
This survey provides a comprehensive overview of the multimodality-based meta-learning landscape.
We first formalize the definition of meta-learning and multimodality, along with the research challenges in this growing field.
We then propose a new taxonomy to systematically discuss typical meta-learning algorithms combined with multimodal tasks.
arXiv Detail & Related papers (2021-09-28T09:16:12Z)
- Panoramic Learning with A Standardized Machine Learning Formalism [116.34627789412102]
This paper presents a standardized equation of the learning objective that offers a unifying understanding of diverse ML algorithms.
It also provides guidance for the mechanical design of new ML solutions and serves as a promising vehicle towards panoramic learning with all experiences.
arXiv Detail & Related papers (2021-08-17T17:44:38Z)
- A Systems Theory of Transfer Learning [3.5281112495479245]
We use Mesarovician systems theory to define transfer learning as a relation on sets.
We then characterize the general nature of transfer learning as a mathematical construct.
Despite its formalism, our framework avoids the detailed mathematics of learning theory and of machine learning solution methods.
arXiv Detail & Related papers (2021-07-02T17:25:42Z)
- Online Structured Meta-learning [137.48138166279313]
Current online meta-learning algorithms are limited to learning a globally shared meta-learner.
We propose an online structured meta-learning (OSML) framework to overcome this limitation.
Experiments on three datasets demonstrate the effectiveness and interpretability of our proposed framework.
arXiv Detail & Related papers (2020-10-22T09:10:31Z)
- Concept Learners for Few-Shot Learning [76.08585517480807]
We propose COMET, a meta-learning method that improves generalization ability by learning to learn along human-interpretable concept dimensions.
We evaluate our model on few-shot tasks from diverse domains, including fine-grained image classification, document categorization and cell type annotation.
arXiv Detail & Related papers (2020-07-14T22:04:17Z)
- Self-organizing Democratized Learning: Towards Large-scale Distributed Learning Systems [71.14339738190202]
Democratized learning (Dem-AI) lays out a holistic philosophy with underlying principles for building large-scale distributed and democratized machine learning systems.
Inspired by Dem-AI philosophy, a novel distributed learning approach is proposed in this paper.
The proposed algorithms achieve better generalization performance for the agents' learning models than conventional federated learning (FL) algorithms.
arXiv Detail & Related papers (2020-07-07T08:34:48Z)
- Automated Relational Meta-learning [95.02216511235191]
We propose an automated relational meta-learning (ARML) framework that automatically extracts cross-task relations and constructs a meta-knowledge graph.
Extensive experiments on 2D toy regression and few-shot image classification demonstrate the superiority of ARML over state-of-the-art baselines.
arXiv Detail & Related papers (2020-01-03T07:02:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.