Meta Learning for Natural Language Processing: A Survey
- URL: http://arxiv.org/abs/2205.01500v1
- Date: Tue, 3 May 2022 13:58:38 GMT
- Title: Meta Learning for Natural Language Processing: A Survey
- Authors: Hung-yi Lee, Shang-Wen Li, Ngoc Thang Vu
- Abstract summary: Deep learning has been the mainstream technique in the natural language processing (NLP) area.
Deep learning requires large amounts of labeled data and is less generalizable across domains.
Meta-learning is an emerging field of machine learning that studies approaches to learning better learning algorithms.
- Score: 88.58260839196019
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning has been the mainstream technique in the natural
language processing (NLP) area. However, these techniques require large amounts
of labeled data and are less generalizable across domains. Meta-learning is an
emerging field of machine learning that studies approaches to learning better
learning algorithms, aiming to improve them in aspects such as data efficiency
and generalizability. The efficacy of these approaches has been shown in many
NLP tasks, but there is no systematic survey of them in NLP, which hinders more
researchers from joining the field. Our goal with this survey paper is to offer
researchers pointers to relevant meta-learning works in NLP and to attract more
attention from the NLP community to drive future innovation. This paper first
introduces the general concepts of meta-learning and the common approaches.
Then we summarize the task construction settings and applications of
meta-learning for various NLP problems, and review the development of
meta-learning in the NLP community.
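The "learning to learn" idea described in the abstract can be illustrated with a toy gradient-based meta-learner. The sketch below uses a Reptile-style meta-update on a hypothetical family of one-dimensional regression tasks; the task family, learning rates, and step counts are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    """Each task is a 1-D regression problem y = a*x with its own slope a."""
    a = rng.uniform(0.5, 2.5)
    x = rng.uniform(-1.0, 1.0, size=10)
    return x, a * x

def adapt(w, x, y, lr=0.5, steps=5):
    """Inner loop: a few gradient steps on one task's data, from init w."""
    for _ in range(steps):
        grad = 2.0 * np.mean((w * x - y) * x)  # d/dw of mean squared error
        w = w - lr * grad
    return w

# Outer loop (Reptile meta-update): nudge the shared initialization toward
# each task's adapted weight, so a few inner steps suffice on new tasks.
w_meta = 0.0
for _ in range(1000):
    x, y = sample_task()
    w_meta += 0.1 * (adapt(w_meta, x, y) - w_meta)
```

After meta-training, `w_meta` sits near the centre of the task distribution, so adaptation from it reaches low error with fewer labeled examples than adaptation from scratch, which is the data-efficiency property the abstract attributes to meta-learning.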
Related papers
- A survey of neural-network-based methods utilising comparable data for finding translation equivalents [0.0]
We present the most common approaches from NLP that endeavour to automatically induce one of the essential dictionary components.
We analyse them from a lexicographic perspective since their viewpoints are crucial for improving the described methods.
This survey encourages a connection between the NLP and lexicography fields as the NLP field can benefit from lexicographic insights.
arXiv Detail & Related papers (2024-10-19T16:10:41Z) - Survey of Natural Language Processing for Education: Taxonomy, Systematic Review, and Future Trends [26.90343340881045]
We review recent advances in NLP with the focus on solving problems relevant to the education domain.
We present a taxonomy of NLP in the education domain and highlight typical NLP applications including question answering, question construction, automated assessment, and error correction.
We conclude with six promising directions for future research, including more datasets in education domain, controllable usage of LLMs, intervention of difficulty-level control, interpretable educational NLP, methods with adaptive learning, and integrated systems for education.
arXiv Detail & Related papers (2024-01-15T07:48:42Z) - Natural Language Processing for Dialects of a Language: A Survey [56.93337350526933]
State-of-the-art natural language processing (NLP) models are trained on massive training corpora, and report superlative performance on evaluation datasets.
This survey delves into an important attribute of these datasets: the dialect of a language.
Motivated by the performance degradation of NLP models on dialectal datasets and its implications for the equity of language technologies, we survey past research in NLP for dialects in terms of datasets and approaches.
arXiv Detail & Related papers (2024-01-11T03:04:38Z) - Surveying the Landscape of Text Summarization with Deep Learning: A Comprehensive Review [2.4185510826808487]
Deep learning has revolutionized natural language processing (NLP) by enabling the development of models that can learn complex representations of language data.
Deep learning models for NLP typically use large amounts of data to train deep neural networks, allowing them to learn the patterns and relationships in language data.
Applying deep learning to text summarization refers to the use of deep neural networks to perform text summarization tasks.
arXiv Detail & Related papers (2023-10-13T21:24:37Z) - Beyond Good Intentions: Reporting the Research Landscape of NLP for Social Good [115.1507728564964]
We introduce NLP4SG Papers, a scientific dataset with three associated tasks.
These tasks help identify NLP4SG papers and characterize the NLP4SG landscape.
We use state-of-the-art NLP models to address each of these tasks and apply them across the entire ACL Anthology.
arXiv Detail & Related papers (2023-05-09T14:16:25Z) - Systematic Inequalities in Language Technology Performance across the World's Languages [94.65681336393425]
We introduce a framework for estimating the global utility of language technologies.
Our analyses involve the field at large, but also more in-depth studies on both user-facing technologies and more linguistic NLP tasks.
arXiv Detail & Related papers (2021-10-13T14:03:07Z) - FedNLP: A Research Platform for Federated Learning in Natural Language Processing [55.01246123092445]
We present the FedNLP, a research platform for federated learning in NLP.
FedNLP supports various popular task formulations in NLP such as text classification, sequence tagging, question answering, seq2seq generation, and language modeling.
Preliminary experiments with FedNLP reveal that there exists a large performance gap between learning on decentralized and centralized datasets.
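The decentralized-vs-centralized setting above rests on federated averaging (FedAvg), the basic aggregation step behind federated-learning platforms such as FedNLP. The sketch below shows that step on a plain weight scalar; the local-update rule and client data are illustrative assumptions, not FedNLP's actual training code.

```python
import numpy as np

def local_update(w, x, y, lr=0.1, steps=10):
    """Each client fits y ~ w*x on its own (decentralized) data."""
    for _ in range(steps):
        w = w - lr * 2.0 * np.mean((w * x - y) * x)  # gradient of MSE
    return w

rng = np.random.default_rng(0)
w_global = 0.0
for _ in range(20):
    client_weights = []
    for slope in (1.0, 3.0):  # two clients with differently-distributed data
        x = rng.uniform(-1.0, 1.0, 20)
        client_weights.append(local_update(w_global, x, slope * x))
    # Server step: average the client models into the next global model.
    w_global = float(np.mean(client_weights))
```

Because no client ever ships raw data to the server, the global model can only see each distribution through the averaged weights, which is one source of the performance gap relative to centralized training.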
arXiv Detail & Related papers (2021-04-18T11:04:49Z) - MPLP: Learning a Message Passing Learning Protocol [63.948465205530916]
We present a novel method for learning the weights of an artificial neural network: a Message Passing Learning Protocol (MPLP).
We abstract every operation occurring in ANNs as an independent agent.
Each agent is responsible for ingesting incoming multidimensional messages from other agents, updating its internal state, and generating multidimensional messages to be passed on to neighbouring agents.
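The agent abstraction described above (ingest messages, update internal state, emit messages to neighbours) can be sketched as a small class. The running-average state update and the ring topology below are illustrative placeholders, not the learned protocol from the MPLP paper.

```python
import numpy as np

class Agent:
    """Illustrative MPLP-style agent: inbox -> state update -> outgoing message."""

    def __init__(self, dim):
        self.state = np.zeros(dim)
        self.inbox = []

    def ingest(self, message):
        """Collect an incoming multidimensional message from another agent."""
        self.inbox.append(np.asarray(message, dtype=float))

    def update(self):
        """Fold the inbox into the internal state (placeholder rule)."""
        if self.inbox:
            self.state = 0.5 * self.state + 0.5 * np.mean(self.inbox, axis=0)
            self.inbox = []

    def emit(self):
        """Produce the message passed on to neighbouring agents."""
        return self.state.copy()

# One synchronous round of message passing over a small ring of three agents.
agents = [Agent(dim=2) for _ in range(3)]
agents[0].state = np.array([1.0, 0.0])
for i, a in enumerate(agents):
    agents[(i + 1) % 3].ingest(a.emit())  # each agent messages its successor
for a in agents:
    a.update()
```

In the paper's framing, the update rule itself would be learned rather than hand-written; the point of the sketch is only the ingest/update/emit decomposition.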
arXiv Detail & Related papers (2020-07-02T09:03:14Z) - Meta-Learning in Neural Networks: A Survey [4.588028371034406]
This survey describes the contemporary meta-learning landscape.
We first discuss definitions of meta-learning and position it with respect to related fields.
We then propose a new taxonomy that provides a more comprehensive breakdown of the space of meta-learning methods.
arXiv Detail & Related papers (2020-04-11T16:34:24Z) - Natural Language Processing Advancements By Deep Learning: A Survey [0.755972004983746]
This survey categorizes and addresses the different aspects and applications of NLP that have benefited from deep learning.
It covers core NLP tasks and applications and describes how deep learning methods and models advance these areas.
arXiv Detail & Related papers (2020-03-02T21:32:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.