Paper Title

Diversified Mutual Learning for Deep Metric Learning

Authors

Wonpyo Park, Wonjae Kim, Kihyun You, Minsu Cho

Abstract

Mutual learning is an ensemble training strategy to improve generalization by transferring individual knowledge to each other while simultaneously training multiple models. In this work, we propose an effective mutual learning method for deep metric learning, called Diversified Mutual Metric Learning, which enhances embedding models with diversified mutual learning. We transfer relational knowledge for deep metric learning by leveraging three kinds of diversities in mutual learning: (1) model diversity from different initializations of models, (2) temporal diversity from different frequencies of parameter update, and (3) view diversity from different augmentations of inputs. Our method is particularly adequate for inductive transfer learning at the lack of large-scale data, where the embedding model is initialized with a pretrained model and then fine-tuned on a target dataset. Extensive experiments show that our method significantly improves individual models as well as their ensemble. Finally, the proposed method with a conventional triplet loss achieves the state-of-the-art performance of Recall@1 on standard datasets: 69.9 on CUB-200-2011 and 89.1 on CARS-196.
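To make the abstract's two ingredients concrete, here is a minimal NumPy sketch of (a) a conventional triplet loss and (b) a relational mutual-learning term that encourages two peer embedding models to agree on their pairwise-distance structure. All function names are illustrative; the paper transfers relational knowledge between peers, but its exact loss formulation and the three diversity mechanisms (model, temporal, view) are not reproduced here.

```python
import numpy as np

def pairwise_distances(emb):
    # Euclidean distance matrix (n x n) over a batch of embeddings (n x d).
    diff = emb[:, None, :] - emb[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Conventional triplet loss: the anchor-positive distance should be
    # smaller than the anchor-negative distance by at least `margin`.
    d_ap = np.linalg.norm(anchor - positive, axis=-1)
    d_an = np.linalg.norm(anchor - negative, axis=-1)
    return np.maximum(d_ap - d_an + margin, 0.0).mean()

def relational_mutual_loss(emb_a, emb_b):
    # Illustrative relational term for mutual learning: penalize the two
    # peer models for disagreeing on the distance structure of the same
    # batch (an assumption; the paper's relational-transfer loss may differ).
    return ((pairwise_distances(emb_a) - pairwise_distances(emb_b)) ** 2).mean()
```

In a mutual-learning setup each peer would minimize its own triplet loss on the batch plus a weighted `relational_mutual_loss` against the other peer's embeddings, so that individual knowledge flows between the models during joint training.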
