Paper Title
Modeling Token-level Uncertainty to Learn Unknown Concepts in SLU via Calibrated Dirichlet Prior RNN
Paper Authors
Paper Abstract
One major task of spoken language understanding (SLU) in modern personal assistants is to extract semantic concepts from an utterance, called slot filling. Although existing slot filling models have attempted to improve the extraction of new concepts that are not seen in training data, their performance in practice is still not satisfactory. Recent research collected question-and-answer annotated data to learn what is unknown and should be asked, but this is not practically scalable due to the heavy data collection effort. In this paper, we incorporate softmax-based slot filling neural architectures to model sequence uncertainty without question supervision. We design a Dirichlet Prior RNN to model high-order uncertainty by degenerating into a softmax layer for RNN model training. To further enhance the robustness of uncertainty modeling, we propose a novel multi-task training scheme to calibrate the Dirichlet concentration parameters. We collect unseen concepts to create two test datasets from the SLU benchmark datasets Snips and ATIS. On these two datasets and another existing Concept Learning benchmark dataset, we show that our approach significantly outperforms state-of-the-art approaches by up to 8.18%. Our method is generic and can be applied to any RNN- or Transformer-based slot filling model with a softmax layer.
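To make the abstract's core idea concrete, the following is a minimal illustrative sketch, not code from the paper: it assumes PyTorch, and the function name dirichlet_uncertainty and the 0.5 threshold are hypothetical. It shows how per-token softmax logits can be reinterpreted as Dirichlet concentration parameters, so the usual slot prediction is recovered as the Dirichlet mean while the total concentration gives a token-level uncertainty score for flagging possible unknown concepts; the multi-task calibration of the concentration parameters is not shown.

import torch
import torch.nn.functional as F

def dirichlet_uncertainty(logits):
    # Treat the logits z_k as log-concentrations: alpha_k = exp(z_k).
    alpha = logits.exp()                               # (seq_len, num_slots)
    alpha0 = alpha.sum(dim=-1, keepdim=True)           # total concentration per token
    expected_probs = alpha / alpha0                    # Dirichlet mean = softmax(logits)
    uncertainty = alpha.size(-1) / alpha0.squeeze(-1)  # small alpha0 -> high uncertainty
    return expected_probs, uncertainty

# Toy usage: 5 tokens, 10 slot labels.
logits = torch.randn(5, 10)
probs, unc = dirichlet_uncertainty(logits)
assert torch.allclose(probs, F.softmax(logits, dim=-1))  # degenerates to the softmax prediction
unknown_mask = unc > 0.5  # hypothetical threshold for flagging possible unknown concepts

The only property this sketch relies on is that the Dirichlet mean coincides with the softmax output, which is what allows such a model to be trained as an ordinary softmax classifier while still exposing a per-token uncertainty signal.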