Paper title
Move-to-Data: A new Continual Learning approach with Deep CNNs, Application for image-class recognition
Paper authors
Paper abstract
In many real-life tasks where supervised learning approaches are applied, not all the training data are available at the same time. Examples include lifelong image classification, or the recognition of environmental objects by instrumented persons interacting with their environment, where an online database is enriched with more images over time. It is necessary to pre-train the model in a "training recording phase" and then adjust it to the newly arriving data. This is the task of incremental/continual learning approaches. Among the different problems addressed by these approaches, such as introducing new categories into the model, refining existing categories into sub-categories and extending trained classifiers over them, we focus on the problem of adjusting a pre-trained model with new additional training data for existing categories. We propose a fast continual learning layer at the end of the neural network. The results obtained are illustrated on the open-source CIFAR benchmark dataset. The proposed scheme yields performance similar to retraining but at a drastically lower computational cost.
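The abstract does not spell out the update rule behind "Move-to-Data", only that it is a fast adjustment applied to the last layer of the network. A minimal sketch of one plausible reading is given below: when a new training sample of an existing category arrives, the classifier weight vector of that category is shifted a small step toward the sample's (normalized) feature vector, avoiding any backpropagation. The function name, the step-size rule, and the renormalization are all illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def move_to_data_update(W, feature, label, step=0.1):
    """Shift the last-layer weight vector of `label` toward the new
    sample's feature vector. This is a hypothetical sketch of a
    "move-to-data" style update; `step` and the unit-norm constraint
    are assumptions for illustration only."""
    W = W.copy()
    f = feature / np.linalg.norm(feature)        # normalize the incoming feature
    W[label] = W[label] + step * (f - W[label])  # move the weights toward the data
    W[label] /= np.linalg.norm(W[label])         # keep the class vector unit-norm
    return W

# Usage: a 3-class linear head over 4-dimensional features.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
W /= np.linalg.norm(W, axis=1, keepdims=True)
new_feature = rng.normal(size=4)

# One new sample of class 1 arrives; only that row of W changes.
W_new = move_to_data_update(W, new_feature, label=1, step=0.1)
```

Compared with retraining, such an update touches a single weight vector per incoming sample, which is consistent with the drastically lower computational cost the abstract claims.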