Paper Title

Differentiable Weighted Finite-State Transducers

Paper Authors

Awni Hannun, Vineel Pratap, Jacob Kahn, Wei-Ning Hsu

Paper Abstract

We introduce a framework for automatic differentiation with weighted finite-state transducers (WFSTs) allowing them to be used dynamically at training time. Through the separation of graphs from operations on graphs, this framework enables the exploration of new structured loss functions which in turn eases the encoding of prior knowledge into learning algorithms. We show how the framework can combine pruning and back-off in transition models with various sequence-level loss functions. We also show how to learn over the latent decomposition of phrases into word pieces. Finally, to demonstrate that WFSTs can be used in the interior of a deep neural network, we propose a convolutional WFST layer which maps lower-level representations to higher-level representations and can be used as a drop-in replacement for a traditional convolution. We validate these algorithms with experiments in handwriting recognition and speech recognition.
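The abstract does not include code, so the following is only an illustrative sketch, not the authors' implementation or API. It shows the central idea in miniature: the WFST is a plain graph data structure, the forward score (log-sum-exp over all accepting paths) is an operation applied to that graph, and automatic differentiation propagates gradients back to the arc weights so the graph can be used inside a training loss. PyTorch autograd stands in here for the framework's own automatic differentiation; the `Graph` and `forward_score` names are assumptions for this sketch.

```python
# Illustrative sketch only (not the paper's framework): a tiny acyclic weighted
# finite-state acceptor whose arc weights are torch tensors, making the forward
# score differentiable with respect to the weights.
import torch

class Graph:
    """A small WFSA: arcs are (src, dst, label, weight); nodes are integers."""
    def __init__(self):
        self.arcs = []          # list of (src, dst, label, weight tensor)
        self.start = set()
        self.accept = set()
        self.num_nodes = 0

    def add_node(self, start=False, accept=False):
        idx = self.num_nodes
        self.num_nodes += 1
        if start:
            self.start.add(idx)
        if accept:
            self.accept.add(idx)
        return idx

    def add_arc(self, src, dst, label, weight=0.0):
        w = torch.tensor(float(weight), requires_grad=True)
        self.arcs.append((src, dst, label, w))
        return w

def forward_score(g):
    """Log-sum-exp of path weights; assumes node ids are in topological order."""
    neg_inf = torch.tensor(float("-inf"))
    scores = [neg_inf] * g.num_nodes
    for s in g.start:
        scores[s] = torch.tensor(0.0)
    for src, dst, _, w in sorted(g.arcs, key=lambda a: a[1]):
        scores[dst] = torch.logsumexp(
            torch.stack([scores[dst], scores[src] + w]), dim=0)
    return torch.logsumexp(torch.stack([scores[n] for n in g.accept]), dim=0)

# Two competing paths that accept the same input; the gradients of the negative
# forward score are the (negative) path posteriors under the log semiring,
# which is what a sequence-level WFST loss would backpropagate to a network.
g = Graph()
n0 = g.add_node(start=True)
n1 = g.add_node()
n2 = g.add_node(accept=True)
w0 = g.add_arc(n0, n1, "a", 1.0)
w1 = g.add_arc(n1, n2, "b", 2.0)
w2 = g.add_arc(n0, n2, "ab", 0.5)

loss = -forward_score(g)
loss.backward()
print(float(loss), float(w0.grad), float(w2.grad))
```

In the paper's framing, the graph construction above could happen dynamically at training time (e.g., building a different transition or decomposition graph per example), while operations such as composition and forward scoring remain fixed, differentiable building blocks.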
