Noise Robust Named Entity Understanding for Voice Assistants
- URL: http://arxiv.org/abs/2005.14408v3
- Date: Tue, 10 Aug 2021 17:39:55 GMT
- Title: Noise Robust Named Entity Understanding for Voice Assistants
- Authors: Deepak Muralidharan, Joel Ruben Antony Moniz, Sida Gao, Xiao Yang,
Justine Kao, Stephen Pulman, Atish Kothari, Ray Shen, Yinying Pan, Vivek
Kaul, Mubarak Seyed Ibrahim, Gang Xiang, Nan Dun, Yidan Zhou, Andy O, Yuan
Zhang, Pooja Chitkara, Xuan Wang, Alkesh Patel, Kushal Tayal, Roger Zheng,
Peter Grasch, Jason D. Williams, Lin Li
- Abstract summary: We show that our proposed framework improves NER accuracy by up to 3.13% and EL accuracy by up to 3.6% in F1 score.
The features used also lead to better accuracies in other natural language understanding tasks, such as domain classification and semantic parsing.
- Score: 14.193603900541005
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Named Entity Recognition (NER) and Entity Linking (EL) play an essential role
in voice assistant interaction, but are challenging due to the special
difficulties associated with spoken user queries. In this paper, we propose a
novel architecture that jointly solves the NER and EL tasks by combining them
in a joint reranking module. We show that our proposed framework improves NER
accuracy by up to 3.13% and EL accuracy by up to 3.6% in F1 score. The features
used also lead to better accuracies in other natural language understanding
tasks, such as domain classification and semantic parsing.
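To make the joint reranking idea from the abstract concrete, here is a minimal sketch (purely illustrative, not the authors' implementation): each candidate pairs an NER hypothesis with an entity linking result, and a single scorer reranks the pairs using features from both stages. The Candidate fields, feature names, and weights below are hypothetical.
```python
# A minimal, illustrative sketch of joint NER+EL reranking (not the paper's
# actual architecture): each candidate pairs an NER hypothesis with an entity
# linking result, and one scorer reranks the pairs so that evidence from
# either stage can correct errors in the other.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Candidate:
    ner_spans: List[Tuple[int, int, str]]  # (start, end, type) from the NER stage
    el_entity: Optional[str]               # linked knowledge-base entry, if any
    ner_score: float                       # confidence from the NER model
    el_score: float                        # confidence from the entity linker
    popularity: float                      # hypothetical KB popularity feature

# Hypothetical feature weights; a real reranker would learn these.
WEIGHTS = {"ner_score": 1.0, "el_score": 1.0, "popularity": 0.3}

def rerank(candidates: List[Candidate]) -> List[Candidate]:
    """Sort candidates by a joint NER+EL score, highest first."""
    def joint_score(c: Candidate) -> float:
        return (WEIGHTS["ner_score"] * c.ner_score
                + WEIGHTS["el_score"] * c.el_score
                + WEIGHTS["popularity"] * c.popularity)
    return sorted(candidates, key=joint_score, reverse=True)

# Toy query "play thriller by michael jackson": the NER hypothesis that scores
# higher on its own is outranked once entity-linking evidence is added.
candidates = [
    Candidate([(5, 13, "MusicTitle")], "Thriller (song)", 0.62, 0.91, 0.8),
    Candidate([(5, 13, "App")], None, 0.70, 0.10, 0.0),
]
best = rerank(candidates)[0]
print(best.ner_spans, best.el_entity)
```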
Related papers
- Learning Robust Named Entity Recognizers From Noisy Data With Retrieval Augmentation [67.89838237013078]
Named entity recognition (NER) models often struggle with noisy inputs.
We propose a more realistic setting in which only noisy text and its NER labels are available.
We employ a multi-view training framework that improves robust NER without retrieving text during inference.
arXiv Detail & Related papers (2024-07-26T07:30:41Z)
- In-Context Learning for Few-Shot Nested Named Entity Recognition [53.55310639969833]
We introduce an effective and innovative ICL framework for the setting of few-shot nested NER.
We improve the ICL prompt by devising a novel example demonstration selection mechanism, EnDe retriever.
In EnDe retriever, we employ contrastive learning to perform three types of representation learning, in terms of semantic similarity, boundary similarity, and label similarity.
arXiv Detail & Related papers (2024-02-02T06:57:53Z)
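A toy sketch of what a multi-view contrastive objective like the one described for the EnDe retriever above could look like (purely illustrative; not that paper's actual formulation): one InfoNCE-style term per similarity view, summed. The similarity functions standing in for semantic, boundary, and label similarity are hypothetical placeholders.
```python
# Purely illustrative multi-view contrastive objective: one InfoNCE-style
# term per view (here stand-ins for semantic, boundary, and label similarity).
import math
from typing import Callable, List, Sequence

Item = Sequence[float]
Sim = Callable[[Item, Item], float]

def info_nce(anchor: Item, positive: Item, negatives: List[Item],
             sim: Sim, temperature: float = 0.1) -> float:
    """InfoNCE loss for one anchor, one positive, and a set of negatives."""
    logits = [sim(anchor, positive) / temperature] + [
        sim(anchor, n) / temperature for n in negatives
    ]
    log_denom = math.log(sum(math.exp(l) for l in logits))
    return -(logits[0] - log_denom)

def multi_view_loss(anchor: Item, positive: Item, negatives: List[Item],
                    views: List[Sim]) -> float:
    """Sum one contrastive term per similarity view."""
    return sum(info_nce(anchor, positive, negatives, sim) for sim in views)

# Toy example: items are 2-d vectors; each "view" is a different hypothetical
# similarity function over them.
views = [
    lambda a, b: a[0] * b[0] + a[1] * b[1],    # "semantic": dot product
    lambda a, b: -abs(a[0] - b[0]),            # "boundary": closeness in one dim
    lambda a, b: 1.0 if a[1] == b[1] else 0.0, # "label": exact match in one dim
]
print(multi_view_loss((1.0, 1.0), (0.9, 1.0), [(-1.0, 0.0), (0.2, 0.1)], views))
```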
- Named Entity Recognition via Machine Reading Comprehension: A Multi-Task Learning Approach [50.12455129619845]
Named Entity Recognition (NER) aims to extract and classify entity mentions in the text into pre-defined types.
We propose to incorporate the label dependencies among entity types into a multi-task learning framework for better MRC-based NER.
arXiv Detail & Related papers (2023-09-20T03:15:05Z)
- IXA/Cogcomp at SemEval-2023 Task 2: Context-enriched Multilingual Named Entity Recognition using Knowledge Bases [53.054598423181844]
We present a novel NER cascade approach comprising three steps.
We empirically demonstrate the significance of external knowledge bases in accurately classifying fine-grained and emerging entities.
Our system exhibits robust performance in the MultiCoNER2 shared task, even in the low-resource language setting.
arXiv Detail & Related papers (2023-04-20T20:30:34Z)
- Dynamic Named Entity Recognition [5.9401550252715865]
We introduce a new task: Dynamic Named Entity Recognition (DNER).
DNER provides a framework to better evaluate the ability of algorithms to extract entities by exploiting the context.
We evaluate baseline models and present experiments reflecting issues and research axes related to this novel task.
arXiv Detail & Related papers (2023-02-16T15:50:02Z)
- Empirical Study of Named Entity Recognition Performance Using Distribution-aware Word Embedding [15.955385058787348]
We develop a distribution-aware word embedding and implement three different methods to make use of the distribution information in a NER framework.
NER performance improves when word specificity is incorporated into existing NER methods.
arXiv Detail & Related papers (2021-09-03T17:28:04Z)
- DEXTER: Deep Encoding of External Knowledge for Named Entity Recognition in Virtual Assistants [10.500933545429202]
In intelligent voice assistants, where NER is an important component, the input to NER may be noisy because of user errors or speech recognition errors.
We describe a NER system intended to address these problems.
We show that this technique improves related tasks, such as semantic parsing, with an improvement of up to 5% in error rate.
arXiv Detail & Related papers (2021-08-15T00:14:47Z)
- Probing Linguistic Features of Sentence-Level Representations in Neural Relation Extraction [80.38130122127882]
We introduce 14 probing tasks targeting linguistic properties relevant to neural relation extraction (RE).
We use them to study representations learned by more than 40 different combinations of encoder architectures and linguistic features trained on two datasets.
We find that the bias induced by the architecture and the inclusion of linguistic features are clearly expressed in the probing task performance.
arXiv Detail & Related papers (2020-04-17T09:17:40Z)
- Improving Readability for Automatic Speech Recognition Transcription [50.86019112545596]
We propose a novel NLP task called ASR post-processing for readability (APR).
APR aims to transform the noisy ASR output into a readable text for humans and downstream tasks while maintaining the semantic meaning of the speaker.
We compare fine-tuned models based on several open-sourced and adapted pre-trained models with the traditional pipeline method.
arXiv Detail & Related papers (2020-04-09T09:26:42Z)