Paper Title

Simplified TinyBERT: Knowledge Distillation for Document Retrieval

Authors

Xuanang Chen, Ben He, Kai Hui, Le Sun, Yingfei Sun

Abstract

Despite the effectiveness of utilizing the BERT model for document ranking, the high computational cost of such approaches limits their uses. To this end, this paper first empirically investigates the effectiveness of two knowledge distillation models on the document ranking task. In addition, on top of the recently proposed TinyBERT model, two simplifications are proposed. Evaluations on two different and widely-used benchmarks demonstrate that Simplified TinyBERT with the proposed simplifications not only boosts TinyBERT, but also significantly outperforms BERT-Base when providing 15$\times$ speedup.
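The abstract's core technique, knowledge distillation, trains a small "student" model (TinyBERT) to mimic a large "teacher" (BERT-Base). The paper's own distillation objectives are not given here, so as a hedged illustration, the sketch below implements only the classic soft-label objective (temperature-scaled cross-entropy between teacher and student output distributions); the function names and the temperature value are illustrative assumptions, not the paper's method.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between softened teacher and student distributions
    (the generic soft-label distillation objective, a simplification of
    BERT-to-TinyBERT distillation)."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

# A student whose logits track the teacher's incurs a lower loss
# than one whose logits diverge.
teacher = [3.0, 1.0, 0.2]
good_student = [2.9, 1.1, 0.3]
bad_student = [0.2, 1.0, 3.0]
print(distillation_loss(teacher, good_student) <
      distillation_loss(teacher, bad_student))
```

In document ranking, such a loss would be applied to the query–document relevance scores the teacher produces, letting the student approach BERT-Base quality at a fraction of the inference cost (the 15× speedup cited above).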
