Don't Parse, Generate! A Sequence to Sequence Architecture for
Task-Oriented Semantic Parsing
- URL: http://arxiv.org/abs/2001.11458v1
- Date: Thu, 30 Jan 2020 17:11:00 GMT
- Title: Don't Parse, Generate! A Sequence to Sequence Architecture for
Task-Oriented Semantic Parsing
- Authors: Subendhu Rongali (University of Massachusetts Amherst), Luca Soldaini
(Amazon Alexa Search), Emilio Monti (Amazon Alexa), Wael Hamza (Amazon Alexa
AI)
- Abstract summary: Virtual assistants such as Amazon Alexa, Apple Siri, and Google Assistant often rely on a semantic parsing component to understand which action(s) to execute for an utterance spoken by their users.
We propose a unified architecture based on Sequence to Sequence models and a Pointer Generator Network to handle both simple and complex queries.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Virtual assistants such as Amazon Alexa, Apple Siri, and Google Assistant
often rely on a semantic parsing component to understand which action(s) to
execute for an utterance spoken by its users. Traditionally, rule-based or
statistical slot-filling systems have been used to parse "simple" queries; that
is, queries that contain a single action and can be decomposed into a set of
non-overlapping entities. More recently, shift-reduce parsers have been
proposed to process more complex utterances. These methods, while powerful,
impose specific limitations on the type of queries that can be parsed; namely,
they require a query to be representable as a parse tree.
In this work, we propose a unified architecture based on Sequence to Sequence
models and a Pointer Generator Network to handle both simple and complex
queries.
Unlike other works, our approach does not impose any restriction on the
semantic parse schema. Furthermore, experiments show that it achieves
state-of-the-art performance on three publicly available datasets (ATIS, SNIPS,
Facebook TOP), improving exact match accuracy by 3.3% to 7.7% relative to
previous systems. Finally, we show the effectiveness of our approach on two
internal datasets.
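The pointer-generator decoding described in the abstract mixes a generation distribution over output vocabulary symbols (intent and slot labels) with a copy distribution over source-utterance tokens. A minimal sketch of one decoding step follows; the function names and hand-picked scores are illustrative assumptions, and a real model would compute the logits and the mixing weight p_gen from decoder states:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def pointer_generator_step(vocab_logits, copy_scores, p_gen, source_tokens, vocab):
    """One decoding step of a pointer-generator output layer.

    Blends a generation distribution over the parse-symbol vocabulary
    with a copy distribution over source tokens, weighted by p_gen.
    """
    gen_dist = softmax(vocab_logits)    # probability of generating each vocab symbol
    copy_dist = softmax(copy_scores)    # probability of copying each source token
    final = {tok: p_gen * p for tok, p in zip(vocab, gen_dist)}
    for tok, p in zip(source_tokens, copy_dist):
        # a copied token adds its mass to any identical vocab symbol
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * p
    return final

# Example: the decoder strongly prefers copying "boston" from the utterance.
vocab = ["[IN:GET_WEATHER", "[SL:LOCATION", "]"]
source = ["weather", "in", "boston"]
dist = pointer_generator_step([1.0, 0.5, 0.2], [0.3, 0.1, 2.0], 0.6, source, vocab)
```

Because entity tokens are copied from the input rather than generated, the target vocabulary only needs the parse symbols themselves, which is what lets this style of model avoid restrictions on the parse schema.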
Related papers
- Semantic Parsing for Conversational Question Answering over Knowledge
Graphs [63.939700311269156]
We develop a dataset where user questions are annotated with Sparql parses and system answers correspond to execution results thereof.
We present two different semantic parsing approaches and highlight the challenges of the task.
Our dataset and models are released at https://github.com/Edinburgh/SPICE.
arXiv Detail & Related papers (2023-01-28T14:45:11Z) - PIZZA: A new benchmark for complex end-to-end task-oriented parsing [3.5106870325869886]
This paper introduces a new dataset for parsing pizza and drink orders, whose semantics cannot be captured by flat slots and intents.
We perform an evaluation of deep-learning techniques for task-oriented parsing on this dataset, including different flavors of seq2seq models.
arXiv Detail & Related papers (2022-12-01T04:20:07Z) - Improving Text-to-SQL Semantic Parsing with Fine-grained Query
Understanding [84.04706075621013]
We present a general-purpose, modular neural semantic parsing framework based on token-level fine-grained query understanding.
Our framework consists of three modules: a named entity recognizer (NER), a neural entity linker (NEL), and a neural semantic parser (NSP).
arXiv Detail & Related papers (2022-09-28T21:00:30Z) - Proton: Probing Schema Linking Information from Pre-trained Language
Models for Text-to-SQL Parsing [66.55478402233399]
We propose a framework to elicit relational structures via a probing procedure based on the Poincaré distance metric.
Compared with commonly-used rule-based methods for schema linking, we found that probing relations can robustly capture semantic correspondences.
Our framework sets new state-of-the-art performance on three benchmarks.
arXiv Detail & Related papers (2022-06-28T14:05:25Z) - Compositional Task-Oriented Parsing as Abstractive Question Answering [25.682923914685063]
Task-oriented parsing aims to convert natural language into machine-readable representations of specific tasks, such as setting an alarm.
A popular approach to TOP is to apply seq2seq models to generate linearized parse trees.
A more recent line of work argues that pretrained seq2seq models are better at generating outputs that are themselves natural language, so they replace linearized parse trees with canonical natural-language paraphrases.
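The linearized parse trees mentioned above can be produced by a depth-first traversal that emits an opening bracket with the node label, then the children, then a closing bracket. A small sketch assuming a TOP-style tree encoded as nested tuples (the helper name and encoding are illustrative, not from any of the cited papers):

```python
def linearize(node):
    """Depth-first linearization of a TOP-style parse tree into the
    bracketed token sequence a seq2seq decoder would be trained to emit."""
    label, children = node
    parts = ["[" + label]
    for child in children:
        # tuples are sub-trees; plain strings are utterance tokens
        parts.append(linearize(child) if isinstance(child, tuple) else child)
    parts.append("]")
    return " ".join(parts)

tree = ("IN:GET_WEATHER",
        ["what", "is", "the", "weather",
         ("SL:LOCATION", ["in", "boston"])])
print(linearize(tree))
# → [IN:GET_WEATHER what is the weather [SL:LOCATION in boston ] ]
```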
arXiv Detail & Related papers (2022-05-04T14:01:08Z) - Parallel Instance Query Network for Named Entity Recognition [73.30174490672647]
Named entity recognition (NER) is a fundamental task in natural language processing.
Recent works treat named entity recognition as a reading comprehension task, constructing type-specific queries manually to extract entities.
We propose Parallel Instance Query Network (PIQN), which sets up global and learnable instance queries to extract entities in a parallel manner.
arXiv Detail & Related papers (2022-03-20T13:01:25Z) - Rethinking End-to-End Evaluation of Decomposable Tasks: A Case Study on
Spoken Language Understanding [101.24748444126982]
Decomposable tasks are complex and comprise a hierarchy of sub-tasks.
Existing benchmarks, however, typically hold out examples for only the surface-level sub-task.
We propose a framework to construct robust test sets using coordinate ascent over sub-task specific utility functions.
arXiv Detail & Related papers (2021-06-29T02:53:59Z) - X2Parser: Cross-Lingual and Cross-Domain Framework for Task-Oriented
Compositional Semantic Parsing [51.81533991497547]
Task-oriented compositional semantic parsing (TCSP) handles complex nested user queries.
We present X2Parser, a transferable Cross-lingual and Cross-domain Parser for TCSP.
We propose to predict flattened intents and slots representations separately and cast both prediction tasks into sequence labeling problems.
arXiv Detail & Related papers (2021-06-07T16:40:05Z) - Don't Parse, Insert: Multilingual Semantic Parsing with Insertion Based
Decoding [10.002379593718471]
A successful parse transforms an input utterance to an action that is easily understood by the system.
For complex parsing tasks, the state-of-the-art method is based on autoregressive sequence to sequence models to generate the parse directly.
arXiv Detail & Related papers (2020-10-08T01:18:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.