Improving Context Modeling in Neural Topic Segmentation
- URL: http://arxiv.org/abs/2010.03138v1
- Date: Wed, 7 Oct 2020 03:40:49 GMT
- Title: Improving Context Modeling in Neural Topic Segmentation
- Authors: Linzi Xing, Brad Hackinen, Giuseppe Carenini, Francesco Trebbi
- Abstract summary: We enhance a segmenter based on a hierarchical attention BiLSTM network to better model context.
Our optimized segmenter outperforms SOTA approaches when trained and tested on three datasets.
- Score: 18.92944038749279
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Topic segmentation is critical in key NLP tasks and recent works favor highly
effective neural supervised approaches. However, current neural solutions are
arguably limited in how they model context. In this paper, we enhance a
segmenter based on a hierarchical attention BiLSTM network to better model
context, by adding a coherence-related auxiliary task and restricted
self-attention. Our optimized segmenter outperforms SOTA approaches when
trained and tested on three datasets. We also show the robustness of our
proposed model in a domain transfer setting by training it on a large-scale dataset
and testing it on four challenging real-world benchmarks. Furthermore, we apply
our proposed strategy to two other languages (German and Chinese), and show its
effectiveness in multilingual scenarios.
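The restricted self-attention mentioned in the abstract limits each sentence to attending only to its neighbors within a fixed window, on the intuition that topical coherence is a local property. A minimal sketch of that idea is below; the window size, the use of NumPy, and the plain scaled dot-product formulation are illustrative assumptions, not the paper's actual architecture (which builds this into a hierarchical attention BiLSTM).

```python
import numpy as np

def restricted_self_attention(X, window=3):
    """Sketch of window-restricted self-attention over sentence vectors.

    X: (n_sentences, dim) array of sentence encodings.
    window: illustrative hyperparameter; position i may only attend
            to positions j with |i - j| <= window.
    Returns context-enriched sentence vectors of the same shape.
    """
    n, d = X.shape
    scores = X @ X.T / np.sqrt(d)  # scaled dot-product attention scores

    # Band mask: forbid attention outside the local window by setting
    # those scores to -inf before the softmax (they become weight 0).
    idx = np.arange(n)
    scores[np.abs(idx[:, None] - idx[None, :]) > window] = -np.inf

    # Row-wise softmax (subtract the row max for numerical stability).
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X
```

With the mask in place, each output vector mixes only nearby sentences, so the representation used for boundary prediction stays focused on local context rather than the whole document.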