Title
Rethinking Few-Shot Class-Incremental Learning with Open-Set Hypothesis in Hyperbolic Geometry
Authors
Abstract
Few-Shot Class-Incremental Learning (FSCIL) aims to incrementally learn novel classes from a few labeled samples while simultaneously avoiding overfitting and catastrophic forgetting. The current FSCIL protocol is built by mimicking the general class-incremental learning setting, but it is not entirely appropriate due to the different data configuration, i.e., the novel classes all lie in the limited-data regime. In this paper, we rethink the configuration of FSCIL under the open-set hypothesis by reserving the possibility in the first session for incoming categories. To endow the model with better performance on both closed-set and open-set recognition, a Hyperbolic Reciprocal Point Learning module (Hyper-RPL) is built on Reciprocal Point Learning (RPL) with hyperbolic neural networks. Besides, to learn novel categories from limited labeled data, we incorporate a hyperbolic metric learning (Hyper-Metric) module into the distillation-based framework to alleviate the overfitting issue and better handle the trade-off between preserving old knowledge and acquiring new knowledge. Comprehensive evaluations of the proposed configuration and modules on three benchmark datasets validate their effectiveness with respect to three evaluation indicators.
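The abstract does not give the paper's implementation details, but the hyperbolic modules it names (Hyper-RPL, Hyper-Metric) rest on distances measured in hyperbolic rather than Euclidean space. As a minimal illustrative sketch (not the authors' code), the geodesic distance on the standard Poincaré ball model can be computed as follows:

```python
import math

def poincare_distance(x, y):
    """Geodesic distance between two points inside the unit Poincare ball:
    d(x, y) = arccosh(1 + 2*||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))."""
    sq_norm = lambda v: sum(c * c for c in v)
    diff_sq = sq_norm([a - b for a, b in zip(x, y)])
    denom = (1.0 - sq_norm(x)) * (1.0 - sq_norm(y))
    return math.acosh(1.0 + 2.0 * diff_sq / denom)

# Distances grow without bound as points approach the ball's boundary,
# which is why hyperbolic embeddings can separate many classes compactly.
print(poincare_distance([0.0, 0.0], [0.9, 0.0]))  # -> ln(19) ~= 2.944
```

For a point at Euclidean radius r from the origin, this reduces to 2·artanh(r), so the margin between embeddings can be tuned by how close to the boundary they are pushed.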