Extending Whisper with prompt tuning to target-speaker ASR
- URL: http://arxiv.org/abs/2312.08079v2
- Date: Thu, 11 Jan 2024 07:11:18 GMT
- Title: Extending Whisper with prompt tuning to target-speaker ASR
- Authors: Hao Ma, Zhiyuan Peng, Mingjie Shao, Jing Li, Ju Liu
- Abstract summary: Target-speaker automatic speech recognition (ASR) aims to transcribe the desired speech of a target speaker from overlapped utterances.
Most of the existing target-speaker ASR (TS-ASR) methods involve either training from scratch or fully fine-tuning a pre-trained model.
This work leverages prompt tuning, a parameter-efficient fine-tuning approach, to extend Whisper, a large-scale single-talker ASR model, to TS-ASR.
- Score: 18.31992429200396
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Target-speaker automatic speech recognition (ASR) aims to transcribe the
desired speech of a target speaker from multi-talker overlapped utterances.
Most of the existing target-speaker ASR (TS-ASR) methods involve either
training from scratch or fully fine-tuning a pre-trained model, leading to
significant training costs and becoming inapplicable to large foundation
models. This work leverages prompt tuning, a parameter-efficient fine-tuning
approach, to extend Whisper, a large-scale single-talker ASR model, to TS-ASR.
Variants of prompt tuning approaches along with their configurations are
explored and optimized for TS-ASR. Experimental results show that prompt tuning
can achieve performance comparable to state-of-the-art full training approaches
while only requiring about 1% of task-specific model parameters. Notably, the
original Whisper's features, such as inverse text normalization and timestamp
tagging, are retained in target-speaker ASR, keeping the generated
transcriptions natural and informative.
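The core idea in the abstract, prompt tuning, can be sketched as follows: the pre-trained backbone is frozen, and the only trainable parameters are a short sequence of soft-prompt vectors prepended to the input sequence. The sizes below (`d_model`, `prompt_len`, `seq_len`) are illustrative assumptions, not Whisper's actual configuration, and the frozen features are stand-ins for encoder outputs.

```python
import numpy as np

# Illustrative sizes (assumptions, not Whisper's real dimensions).
d_model = 512      # feature dimension
prompt_len = 16    # number of learnable soft-prompt vectors
seq_len = 100      # frames produced by the frozen backbone

rng = np.random.default_rng(0)

# Stand-in for frozen backbone features of one utterance: (seq_len, d_model).
# In prompt tuning these receive no gradient updates.
features = rng.standard_normal((seq_len, d_model))

# The only task-specific, trainable parameters: soft-prompt embeddings.
soft_prompt = rng.standard_normal((prompt_len, d_model)) * 0.02

# Prompt tuning prepends the learned vectors to the input sequence;
# the extended sequence is then processed by the unchanged backbone.
extended = np.concatenate([soft_prompt, features], axis=0)

trainable_params = soft_prompt.size
print(extended.shape)      # (116, 512)
print(trainable_params)    # 8192
```

Because only `soft_prompt` is updated, the task-specific parameter count stays a tiny fraction of the full model, which is what makes the approach applicable to large foundation models like Whisper.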