HIE-SQL: History Information Enhanced Network for Context-Dependent
Text-to-SQL Semantic Parsing
- URL: http://arxiv.org/abs/2203.07376v1
- Date: Mon, 14 Mar 2022 11:58:37 GMT
- Title: HIE-SQL: History Information Enhanced Network for Context-Dependent
Text-to-SQL Semantic Parsing
- Authors: Yanzhao Zheng, Haibin Wang, Baohua Dong, Xingjun Wang, Changshan Li
- Abstract summary: We propose a History Information Enhanced text-to-SQL model (HIE-SQL) to exploit context-dependence information from both history utterances and the last predicted SQL query.
We show our methods improve the performance of HIE-SQL by a significant margin, achieving new state-of-the-art results on two context-dependent text-to-SQL benchmarks.
- Score: 1.343950231082215
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, context-dependent text-to-SQL semantic parsing, which translates
natural language into SQL in an interaction process, has attracted a lot of
attention. Previous works leverage context-dependence information either from
interaction history utterances or the previously predicted SQL queries, but fail
to take advantage of both because of the mismatch between natural language and
logic-form SQL. In this work, we propose a History Information Enhanced
text-to-SQL model (HIE-SQL) to exploit context-dependence information from both
history utterances and the last predicted SQL query. In view of the mismatch,
we treat natural language and SQL as two modalities and propose a bimodal
pre-trained model to bridge the gap between them. In addition, we design a
schema-linking graph to enhance connections from utterances and the SQL query
to the database schema. We show our history information enhanced methods
improve the performance of HIE-SQL by a significant margin, achieving new
state-of-the-art results on the two context-dependent text-to-SQL benchmarks,
the SParC and CoSQL datasets, at the time of writing.
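To make the context-dependence problem concrete, here is a toy illustration of a multi-turn text-to-SQL interaction. The schema, utterances, and SQL are hypothetical examples, and the helper function is only a sketch of the idea that a follow-up query reuses constraints from the last predicted SQL query; it is not the HIE-SQL model.

```python
# Toy context-dependent text-to-SQL interaction (hypothetical schema
# and utterances). Each turn's SQL depends on both the utterance
# history and the previously predicted SQL query.
interaction = [
    ("Show all students in the CS department.",
     "SELECT name FROM student WHERE dept = 'CS'"),
    # The follow-up is elliptical: "those" refers to the previous
    # result, so the parser must carry over the earlier WHERE clause.
    ("Of those, who enrolled after 2020?",
     "SELECT name FROM student WHERE dept = 'CS' AND enroll_year > 2020"),
]

def shares_context(prev_sql: str, cur_sql: str) -> bool:
    """Check that the follow-up query reuses the previous WHERE constraint."""
    prev_where = prev_sql.split("WHERE", 1)[1].strip()
    return prev_where in cur_sql

print(shares_context(interaction[0][1], interaction[1][1]))  # True
```

The point of the example is that neither the history utterances alone ("those" is unresolvable) nor the previous SQL alone (it lacks the new constraint) suffices, which is the mismatch the bimodal pre-trained model above is meant to bridge.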
Related papers
- SQLPrompt: In-Context Text-to-SQL with Minimal Labeled Data [54.69489315952524]
"Prompt" is designed to improve the few-shot prompting capabilities of Text-to-LLMs.
"Prompt" outperforms previous approaches for in-context learning with few labeled data by a large margin.
We show that emphPrompt outperforms previous approaches for in-context learning with few labeled data by a large margin.
arXiv Detail & Related papers (2023-11-06T05:24:06Z) - SQL-PaLM: Improved Large Language Model Adaptation for Text-to-SQL (extended) [53.95151604061761]
This paper introduces a framework for enhancing Text-to-SQL using large language models (LLMs).
With few-shot prompting, we explore the effectiveness of consistency decoding with execution-based error analyses.
With instruction fine-tuning, we delve deep in understanding the critical paradigms that influence the performance of tuned LLMs.
arXiv Detail & Related papers (2023-05-26T21:39:05Z) - STAR: SQL Guided Pre-Training for Context-dependent Text-to-SQL Parsing [64.80483736666123]
We propose STAR, a novel pre-training framework for context-dependent text-to-SQL parsing.
In addition, we construct a large-scale context-dependent text-to-SQL conversation corpus to pre-train STAR.
Extensive experiments show that STAR achieves new state-of-the-art performance on two downstream benchmarks.
arXiv Detail & Related papers (2022-10-21T11:30:07Z) - A Survey on Text-to-SQL Parsing: Concepts, Methods, and Future
Directions [102.8606542189429]
The goal of text-to-SQL parsing is to convert a natural language (NL) question into its corresponding structured query language (SQL) query based on the evidence provided by databases.
Deep neural networks have significantly advanced this task via neural generation models, which automatically learn a mapping function from an input NL question to an output SQL query.
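The mapping function the survey describes can be contrasted with a hand-written rule: below is a minimal sketch that maps one question template to its SQL query, whereas neural generation models learn such mappings automatically from data. The table name and question template are hypothetical, chosen only for illustration.

```python
# Minimal rule-based sketch of the NL-question -> SQL mapping idea.
# A neural text-to-SQL model replaces this single hand-written rule
# with a learned mapping covering arbitrary questions and schemas.
def to_sql(question: str) -> str:
    q = question.lower().rstrip("?")
    if q.startswith("how many "):
        # Treat the first word after "how many" as the table name
        # (a gross simplification; real parsers link to the schema).
        entity = q.removeprefix("how many ").split()[0]
        return f"SELECT COUNT(*) FROM {entity}"
    raise ValueError("unsupported question template")

print(to_sql("How many singers?"))  # SELECT COUNT(*) FROM singers
```

The rule-based version breaks as soon as the question deviates from the template or the table name differs from the question's wording, which is exactly the generalization problem the neural approaches surveyed here address.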
arXiv Detail & Related papers (2022-08-29T14:24:13Z) - CQR-SQL: Conversational Question Reformulation Enhanced
Context-Dependent Text-to-SQL Parsers [35.36754559708944]
Context-dependent text-to-SQL is the task of translating multi-turn questions into database-related SQL queries.
In this paper, we propose CQR-SQL, which uses auxiliary Conversational Question Reformulation (CQR) learning to explicitly exploit and decouple contextual dependency for SQL parsing.
At the time of writing, our CQR-SQL achieves new state-of-the-art results on two context-dependent benchmarks, SParC and CoSQL.
arXiv Detail & Related papers (2022-05-16T13:52:42Z) - S$^2$SQL: Injecting Syntax to Question-Schema Interaction Graph Encoder
for Text-to-SQL Parsers [66.78665327694625]
We propose S$^2$SQL, which injects syntax into the question-schema interaction graph encoder for Text-to-SQL parsing.
We also employ a decoupling constraint to induce diverse edge embeddings, which further improves the network's performance.
Experiments on Spider and the robustness benchmark Spider-Syn demonstrate that the proposed approach outperforms all existing methods when pre-training models are used.
arXiv Detail & Related papers (2022-03-14T09:49:15Z) - Tracking Interaction States for Multi-Turn Text-to-SQL Semantic Parsing [44.0348697408427]
The task of multi-turn text-to-SQL semantic parsing aims to translate natural language utterances in an interaction into SQL queries.
A graph relational network and a non-linear layer are designed to update the representations of these two states respectively.
Experimental results on the challenging CoSQL dataset demonstrate the effectiveness of our proposed method.
arXiv Detail & Related papers (2020-12-09T11:59:58Z) - IGSQL: Database Schema Interaction Graph Based Neural Model for
Context-Dependent Text-to-SQL Generation [61.09660709356527]
We propose a database schema interaction graph encoder to utilize historical information of database schema items.
We evaluate our model on the benchmark SParC and CoSQL datasets.
arXiv Detail & Related papers (2020-11-11T12:56:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.