Paper Title


Continual Lifelong Learning in Natural Language Processing: A Survey

Authors

Magdalena Biesialska, Katarzyna Biesialska, Marta R. Costa-jussà

Abstract


Continual learning (CL) aims to enable information systems to learn from a continuous data stream across time. However, it is difficult for existing deep learning architectures to learn a new task without largely forgetting previously acquired knowledge. Furthermore, CL is particularly challenging for language learning, as natural language is ambiguous: it is discrete, compositional, and its meaning is context-dependent. In this work, we look at the problem of CL through the lens of various NLP tasks. Our survey discusses major challenges in CL and current methods applied in neural network models. We also provide a critical review of the existing CL evaluation methods and datasets in NLP. Finally, we present our outlook on future research directions.
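The forgetting problem the abstract describes can be illustrated with a minimal sketch (not taken from the survey): a tiny perceptron-style linear classifier is trained on one task, then trained only on a second, conflicting task, after which its accuracy on the first task collapses. All data, rules, and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch of catastrophic forgetting with a linear threshold unit.
# Everything here (tasks, learning rate, epochs) is an illustrative assumption.
import random

random.seed(0)

def make_task(rule, n=200):
    """Generate 2-D points in [-1, 1]^2 labeled 1/0 by `rule`."""
    data = []
    for _ in range(n):
        x = (random.uniform(-1, 1), random.uniform(-1, 1))
        data.append((x, 1 if rule(x) else 0))
    return data

def predict(w, x):
    # Linear threshold unit: w[0]*x0 + w[1]*x1 + bias w[2].
    return 1 if w[0] * x[0] + w[1] * x[1] + w[2] > 0 else 0

def train(w, data, lr=0.1, epochs=50):
    # Perceptron-style updates; only the *current* task's data is seen,
    # which is exactly the sequential-training setting CL addresses.
    for _ in range(epochs):
        for x, y in data:
            err = y - predict(w, x)
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            w[2] += lr * err
    return w

def accuracy(w, data):
    return sum(predict(w, x) == y for x, y in data) / len(data)

# Task A labels points by the sign of x0; task B uses the opposite rule,
# so plain sequential training overwrites what was learned for A.
task_a = make_task(lambda x: x[0] > 0)
task_b = make_task(lambda x: x[0] < 0)

w = [0.0, 0.0, 0.0]
w = train(w, task_a)
acc_a_before = accuracy(w, task_a)  # high: the model has learned task A
w = train(w, task_b)                # continue training on task B only
acc_a_after = accuracy(w, task_a)   # drops sharply: task A is "forgotten"
print(acc_a_before, acc_a_after)
```

CL methods surveyed in the paper (e.g., replay, regularization, or parameter isolation) aim to keep `acc_a_after` close to `acc_a_before` while still learning the new task.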
