
A Deep Joint Model of Multi-Scale Intent-Slots Interaction with Second-Order Gate for SLU.

EasyChair Preprint no. 11094

12 pages · Date: October 13, 2023

Abstract

Slot filling and intent detection are crucial tasks in Spoken Language Understanding (SLU). However, most existing joint models establish only shallow connections between intent and slots by sharing parameters, and therefore cannot fully exploit their rich interaction information. Meanwhile, the character-word fusion methods used in Chinese SLU simply combine the initial representations without appropriate guidance, which easily introduces a large amount of noisy information. In this paper, we propose a deep joint model of Multi-Scale intent-slots Interaction with a Second-Order Gate for Chinese SLU (MSIM-SOG). The model consists of two main modules: (1) the Multi-Scale intent-slots Interaction Module (MSIM), which cyclically updates multi-scale information to achieve deep bi-directional interaction between intent and slots; and (2) the Second-Order Gate Module (SOG), which controls the propagation of valuable information through a gate with second-order weights, reducing the noise introduced by fusion, accelerating model convergence, and alleviating overfitting. Experiments on two public datasets demonstrate that our model outperforms the baselines and achieves state-of-the-art performance compared to previous models.

Keyphrases: intent detection, Multi-Scale intent-slots Interaction Module (MSIM), Second-Order Gate (SOG), slot filling
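
To make the gating idea concrete, below is a minimal, hypothetical sketch of a second-order gate that fuses character-level and word-level representations. The abstract does not give the exact SOG equations, so the layer structure, the two-stage (second-order) weighting, and all names (`SecondOrderGate`, `w1`, `w2`, `hidden_size`) are illustrative assumptions rather than the authors' actual formulation.

```python
import torch
import torch.nn as nn


class SecondOrderGate(nn.Module):
    """Illustrative second-order gating over character/word features.

    Assumption: the gate is computed from a two-stage ("second-order")
    projection of the concatenated features; the paper's exact SOG
    definition may differ.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        # First-order projection of the concatenated features.
        self.w1 = nn.Linear(2 * hidden_size, hidden_size)
        # Second-order weighting applied on top of the first projection.
        self.w2 = nn.Linear(hidden_size, hidden_size)

    def forward(self, char_repr: torch.Tensor, word_repr: torch.Tensor) -> torch.Tensor:
        # Fuse the two granularities, then gate the fusion so that only
        # the useful part of each signal reaches the output.
        fused = torch.cat([char_repr, word_repr], dim=-1)
        gate = torch.sigmoid(self.w2(torch.tanh(self.w1(fused))))
        return gate * char_repr + (1.0 - gate) * word_repr


if __name__ == "__main__":
    sog = SecondOrderGate(hidden_size=128)
    chars = torch.randn(2, 10, 128)  # (batch, seq_len, hidden)
    words = torch.randn(2, 10, 128)
    out = sog(chars, words)
    print(out.shape)  # torch.Size([2, 10, 128])
```

In this sketch the sigmoid gate interpolates between the character and word representations per position, which is one common way such a gate suppresses noisy fused information; the actual MSIM-SOG interaction loop is described in the full paper.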

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:11094,
  author = {Qingpeng Wen and Bi Zeng and Pengfei Wei and Huiting Hu},
  title = {A Deep Joint Model of Multi-Scale Intent-Slots Interaction with Second-Order Gate for SLU},
  howpublished = {EasyChair Preprint no. 11094},
  year = {EasyChair, 2023}}