PSM-SQL: Progressive Schema Learning with Multi-granularity Semantics for Text-to-SQL
- URL: http://arxiv.org/abs/2502.05237v1
- Date: Fri, 07 Feb 2025 08:31:57 GMT
- Title: PSM-SQL: Progressive Schema Learning with Multi-granularity Semantics for Text-to-SQL
- Authors: Zhuopan Yang, Yuanzhen Xie, Ruichao Zhong, Yunzhi Tan, Enjie Liu, Zhenguo Yang, Mochi Gao, Bo Hu, Zang Li
- Abstract summary: It is challenging to convert natural language questions into executable SQL queries due to the vast number of database schemas with redundancy. We propose a progressive schema linking framework with multi-granularity semantics (PSM-SQL). PSM-SQL learns the schema semantics at the column, table, and database levels.
- Score: 8.416319689644556
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: It is challenging to convert natural language (NL) questions into executable structured query language (SQL) queries for text-to-SQL tasks, both because the vast number of redundant database schemas interferes with semantic learning and because of the domain shift between NL and SQL. Existing works on schema linking operate at the table level and perform linking only once, ignoring the multi-granularity semantics and chainable cyclicity of schemas. In this paper, we propose a progressive schema linking with multi-granularity semantics (PSM-SQL) framework to reduce redundant database schemas for text-to-SQL. Using the multi-granularity schema linking (MSL) module, PSM-SQL learns schema semantics at the column, table, and database levels. More specifically, a triplet loss is used at the column level to learn embeddings, while fine-tuned LLMs are employed at the database level for schema reasoning. At the table level, MSL combines classifier and similarity scores to model schema interactions for schema linking. In particular, PSM-SQL adopts a chain-loop strategy that reduces the difficulty of schema linking by continuously shrinking the set of redundant schemas. Experiments on text-to-SQL datasets show that PSM-SQL outperforms existing methods by 1-3 percentage points.
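The column-level triplet objective and the chain-loop pruning described above are concrete enough to sketch. Below is a minimal illustration in PyTorch; the function names, margin, keep ratio, and round count are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of PSM-SQL's column-level triplet loss and chain-loop schema
# pruning. All names and hyperparameters are illustrative assumptions; the
# paper's actual implementation may differ.
import torch
import torch.nn.functional as F

def column_triplet_loss(question: torch.Tensor,
                        gold_col: torch.Tensor,
                        redundant_col: torch.Tensor,
                        margin: float = 0.5) -> torch.Tensor:
    """Pull question-relevant column embeddings toward the question embedding
    and push redundant column embeddings away by at least `margin`.
    Inputs are batches of shape (B, d)."""
    pos = F.pairwise_distance(question, gold_col)
    neg = F.pairwise_distance(question, redundant_col)
    return F.relu(pos - neg + margin).mean()

def chain_loop_prune(question: torch.Tensor,
                     columns: torch.Tensor,
                     keep_ratio: float = 0.5,
                     rounds: int = 3) -> list:
    """Chain-loop strategy: repeatedly score the surviving columns (N, d)
    against the question (d,) and drop the least similar ones, shrinking
    the candidate schema each round."""
    kept = list(range(columns.size(0)))
    for _ in range(rounds):
        sims = F.cosine_similarity(question.unsqueeze(0), columns[kept], dim=-1)
        k = max(1, int(len(kept) * keep_ratio))
        kept = [kept[i] for i in sims.topk(k).indices.tolist()]
    return kept  # indices of columns handed to the next linking stage
```

In this sketch, each round hands a smaller candidate schema to the next linking stage, matching the stated intuition of reducing task difficulty by continuously removing redundant schemas.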
Related papers
- LinkAlign: Scalable Schema Linking for Real-World Large-Scale Multi-Database Text-to-SQL [14.677024710675838]
LinkAlign is a novel framework that can effectively adapt existing baselines to real-world environments.
We evaluate our method's performance on the SPIDER and BIRD benchmarks.
LinkAlign ranks highest among models that do not rely on long chain-of-thought reasoning LLMs.
arXiv Detail & Related papers (2025-03-24T11:53:06Z) - Extractive Schema Linking for Text-to-SQL [17.757832644216446]
Text-to-SQL is emerging as a practical interface for real-world databases. We introduce a new approach to adapt decoder-only LLMs to schema linking.
arXiv Detail & Related papers (2025-01-23T19:57:08Z) - V-SQL: A View-based Two-stage Text-to-SQL Framework [0.9719868595277401]
Text-to-SQL methods based on large language models (LLMs) have garnered significant attention.
The core of mainstream text-to-SQL frameworks is schema linking, which aligns user queries with the relevant tables and columns in the database.
Previous methods focused on schema linking while neglecting to enhance LLMs' understanding of database schemas.
arXiv Detail & Related papers (2024-12-17T02:27:50Z) - RSL-SQL: Robust Schema Linking in Text-to-SQL Generation [51.00761167842468]
We propose a novel framework called RSL-SQL that combines bidirectional schema linking, contextual information augmentation, a binary selection strategy, and multi-turn self-correction.
Experiments on benchmarks demonstrate that our approach achieves SOTA execution accuracy among open-source solutions, with 67.2% on BIRD and 87.9% on Spider using GPT-4o.
Our approach also outperforms a series of GPT-4-based Text-to-SQL systems when adopting DeepSeek (much cheaper) with the same intact prompts.
arXiv Detail & Related papers (2024-10-31T16:22:26Z) - The Death of Schema Linking? Text-to-SQL in the Age of Well-Reasoned Language Models [0.9149661171430259]
We revisit schema linking when using the latest generation of large language models (LLMs).
We find empirically that newer models are adept at utilizing relevant schema elements during generation even in the presence of large numbers of irrelevant ones.
Instead of filtering contextual information, we highlight techniques such as augmentation, selection, and correction, and adopt them to improve the accuracy of our Text-to-SQL pipeline on the BIRD benchmark.
arXiv Detail & Related papers (2024-08-14T17:59:04Z) - RB-SQL: A Retrieval-based LLM Framework for Text-to-SQL [48.516004807486745]
Large language models (LLMs) with in-context learning have significantly improved performance on the text-to-SQL task.
We propose RB-SQL, a novel retrieval-based LLM framework for in-context prompt engineering.
Experiment results demonstrate that our model achieves better performance than several competitive baselines on public datasets BIRD and Spider.
arXiv Detail & Related papers (2024-07-11T08:19:58Z) - Schema-Aware Multi-Task Learning for Complex Text-to-SQL [4.913409359995421]
We present a schema-aware multi-task learning framework (named MTSQL) for complicated SQL queries.
Specifically, we design a schema linking discriminator module to distinguish the valid question-schema linkings.
On the decoder side, we define six types of relationships to describe the connections between tables and columns.
arXiv Detail & Related papers (2024-03-09T01:13:37Z) - TAP4LLM: Table Provider on Sampling, Augmenting, and Packing Semi-structured Data for Large Language Model Reasoning [55.33939289989238]
We propose TAP4LLM as a versatile pre-processor suite for leveraging large language models (LLMs) in table-based tasks effectively.
It covers several distinct components: (1) table sampling to decompose large tables into manageable sub-tables based on query semantics, (2) table augmentation to enhance tables with additional knowledge from external sources or models, and (3) table packing & serialization to convert tables into various formats suitable for LLMs' understanding (a sketch of how these stages might compose follows the list below).
arXiv Detail & Related papers (2023-12-14T15:37:04Z) - DBCopilot: Scaling Natural Language Querying to Massive Databases [47.009638761948466]
Existing methods face scalability challenges when dealing with massive, dynamically changing databases.
This paper introduces DBCopilot, a framework that employs a compact and flexible copilot model for routing across massive databases.
arXiv Detail & Related papers (2023-12-06T12:37:28Z) - Semantic Enhanced Text-to-SQL Parsing via Iteratively Learning Schema Linking Graph [6.13728903057727]
Generalizability to new databases is of vital importance to Text-to-SQL systems, which aim to parse human utterances into SQL statements.
In this paper, we propose a framework named ISESL-SQL to iteratively build an enhanced semantic schema-linking graph between question tokens and database schemas.
Extensive experiments on three benchmarks demonstrate that ISESL-SQL consistently outperforms the baselines, and further investigations show its generalizability and robustness.
arXiv Detail & Related papers (2022-08-08T03:59:33Z) - Proton: Probing Schema Linking Information from Pre-trained Language Models for Text-to-SQL Parsing [66.55478402233399]
We propose a framework to elicit relational structures via a probing procedure based on the Poincaré distance metric (the distance formula is sketched after this list).
Compared with commonly-used rule-based methods for schema linking, we found that probing relations can robustly capture semantic correspondences.
Our framework sets new state-of-the-art performance on three benchmarks.
arXiv Detail & Related papers (2022-06-28T14:05:25Z) - UniSAr: A Unified Structure-Aware Autoregressive Language Model for Text-to-SQL [48.21638676148253]
We present UniSAr (Unified Structure-Aware Autoregressive Language Model), which benefits from using an off-the-shelf language model.
Specifically, UniSAr extends existing autoregressive models to incorporate three non-invasive extensions to make them structure-aware.
arXiv Detail & Related papers (2022-03-15T11:02:55Z) - GraPPa: Grammar-Augmented Pre-Training for Table Semantic Parsing [117.98107557103877]
We present GraPPa, an effective pre-training approach for table semantic parsing.
We construct synthetic question-SQL pairs over high-quality tables via a synchronous context-free grammar.
To maintain the model's ability to represent real-world data, we also include masked language modeling.
arXiv Detail & Related papers (2020-09-29T08:17:58Z)
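As noted in the TAP4LLM entry above, its three pre-processing stages compose naturally into a pipeline. The following is a hypothetical sketch of that composition; the `Table` type, function names, and the lexical-overlap sampling heuristic are all assumptions for illustration, not the paper's actual API.

```python
# Hypothetical composition of TAP4LLM's three stages: (1) sampling,
# (2) augmentation, (3) packing & serialization. Names and heuristics
# here are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Table:
    headers: list  # list of column names
    rows: list     # list of rows, each a list of strings

def sample_subtable(table: Table, query: str, max_rows: int = 20) -> Table:
    # (1) Table sampling: keep the rows with the most lexical overlap with
    # the query, a crude stand-in for query-semantic decomposition.
    terms = set(query.lower().split())
    scored = sorted(table.rows,
                    key=lambda r: -len(terms & set(" ".join(r).lower().split())))
    return Table(table.headers, scored[:max_rows])

def augment(table: Table, knowledge: dict) -> Table:
    # (2) Table augmentation: attach external knowledge as an extra column,
    # keyed here (arbitrarily) on the first cell of each row.
    return Table(table.headers + ["note"],
                 [r + [knowledge.get(r[0], "")] for r in table.rows])

def pack_markdown(table: Table) -> str:
    # (3) Packing & serialization: render one possible format (markdown)
    # for inclusion in an LLM prompt.
    lines = ["| " + " | ".join(table.headers) + " |",
             "| " + " | ".join("---" for _ in table.headers) + " |"]
    lines += ["| " + " | ".join(r) + " |" for r in table.rows]
    return "\n".join(lines)
```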
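Similarly, for the Proton entry above: the Poincaré distance it probes with is the standard geodesic distance on the Poincaré ball, shown below; the probing procedure itself is not reproduced here.

```python
import torch

def poincare_distance(u: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Geodesic distance in the Poincare ball (inputs must have norm < 1):
    d(u, v) = arcosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))."""
    sq_diff = (u - v).pow(2).sum(-1)
    denom = (1 - u.pow(2).sum(-1)) * (1 - v.pow(2).sum(-1))
    return torch.acosh(1 + 2 * sq_diff / denom)
```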