Paper Title

Queried Unlabeled Data Improves and Robustifies Class-Incremental Learning

Paper Authors

Tianlong Chen, Sijia Liu, Shiyu Chang, Lisa Amini, Zhangyang Wang

Abstract

Class-incremental learning (CIL) suffers from the notorious dilemma between learning newly added classes and preserving previously learned class knowledge. This catastrophic forgetting issue can be mitigated by storing historical data for replay, which yet causes memory overheads as well as imbalanced prediction updates. To address this dilemma, we propose to leverage "free" external unlabeled data querying in continual learning. We first present a CIL with Queried Unlabeled Data (CIL-QUD) scheme, where we store only a handful of past training samples as anchors and use them to query relevant unlabeled examples each time. Along with new and past stored data, the queried unlabeled data are effectively utilized through learning-without-forgetting (LwF) regularizers and class-balanced training. Besides preserving model generalization over past and current tasks, we next study the problem of adversarial robustness for CIL-QUD. Inspired by the recent success of learning robust models with unlabeled data, we explore a new robustness-aware CIL setting, where the learned adversarial robustness has to resist forgetting and be transferred as new tasks come in continually. While existing options easily fail, we show that queried unlabeled data can continue to benefit, and seamlessly extend CIL-QUD into its robustified version, RCIL-QUD. Extensive experiments demonstrate that CIL-QUD achieves substantial accuracy gains on CIFAR-10 and CIFAR-100, compared to previous state-of-the-art CIL approaches. Moreover, RCIL-QUD establishes the first strong milestone for robustness-aware CIL. Code is available at https://github.com/VITA-Group/CIL-QUD.
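The abstract's learning-without-forgetting (LwF) regularizer is, at its core, a distillation loss: the current model is penalized for deviating from the frozen previous model's predictions on shared inputs (here, the queried unlabeled examples). The sketch below is a minimal, illustrative implementation of that distillation term, not the authors' code; the function names and the temperature value are assumptions for illustration.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def lwf_distillation_loss(old_logits, new_logits, temperature=2.0):
    """LwF-style regularizer: cross-entropy between the frozen old
    model's softened outputs (targets) and the current model's
    softened outputs on the same inputs. Minimized when the new
    model reproduces the old model's predictive distribution."""
    p_old = softmax(old_logits, temperature)            # fixed targets
    log_p_new = np.log(softmax(new_logits, temperature) + 1e-12)
    return float(-(p_old * log_p_new).sum(axis=-1).mean())
```

In a CIL-QUD-like setup, this term would be computed on queried unlabeled examples (no labels needed, since the old model's outputs serve as soft targets) and added to the supervised loss on new-class data; the temperature smooths the target distribution so that non-maximal class scores also constrain the update.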
