Paper Title

Guided Deep Metric Learning

Authors

Gonzalez-Zapata, Jorge, Reyes-Amezcua, Ivan, Flores-Araiza, Daniel, Mendez-Ruiz, Mauricio, Ochoa-Ruiz, Gilberto, Mendez-Vazquez, Andres

Abstract

Deep Metric Learning (DML) methods have been proven relevant for visual similarity learning. However, they sometimes lack generalization properties, because they are often trained with an inappropriate sample selection strategy or because of the difficulty of the dataset caused by a distributional shift in the data. These represent significant drawbacks when attempting to learn the underlying data manifold. Therefore, there is a pressing need to develop better ways of obtaining generalization and representation of the underlying manifold. In this paper, we propose a novel approach to DML that we call Guided Deep Metric Learning, an architecture oriented toward learning more compact clusters, improving generalization under distributional shifts in DML. This architecture consists of two independent models: a multi-branch master model, inspired by a Few-Shot Learning (FSL) perspective, which generates a reduced hypothesis space based on prior knowledge from labeled data, and a student model whose decision boundary this hypothesis space guides or regularizes during training under an offline knowledge distillation scheme. Experiments show that the proposed method achieves better manifold generalization and representation, with up to a 40% improvement (Recall@1, CIFAR10), while following the guidelines suggested by Musgrave et al. to perform a fairer and more realistic comparison, which is currently absent in the literature.
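To make the offline knowledge distillation scheme described above concrete, here is a minimal PyTorch sketch of the general idea: a pre-trained master (teacher) is frozen, and a student embedding network is trained with a metric loss plus a guidance term that keeps its embeddings close to the teacher's. The `EmbeddingNet` module, the contrastive-style loss, and the weight `alpha` are illustrative assumptions for exposition, not the authors' exact multi-branch architecture or loss.

```python
# Minimal sketch of offline knowledge distillation for metric learning.
# Assumption: the master/teacher was trained beforehand and is frozen;
# its embeddings regularize the student's decision boundary.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Toy encoder mapping inputs to an L2-normalized embedding (hypothetical)."""
    def __init__(self, in_dim=512, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, emb_dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)

def distillation_step(student, teacher, x, labels, optimizer, alpha=0.5):
    """One training step: metric loss + embedding-matching guidance loss."""
    with torch.no_grad():              # offline KD: teacher is frozen
        t_emb = teacher(x)
    s_emb = student(x)

    # Simple contrastive-style metric loss on the student embeddings:
    # pull same-label pairs together, push different-label pairs apart.
    dists = torch.cdist(s_emb, s_emb)
    same = (labels[:, None] == labels[None, :]).float()
    margin = 0.5
    metric_loss = (same * dists.pow(2)
                   + (1 - same) * F.relu(margin - dists).pow(2)).mean()

    # Guidance term: keep student embeddings near the teacher's reduced
    # hypothesis space (here, simply matching the teacher's embeddings).
    guide_loss = F.mse_loss(s_emb, t_emb)

    loss = metric_loss + alpha * guide_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage with random data (illustrative only):
teacher = EmbeddingNet().eval()        # stands in for the pretrained master
for p in teacher.parameters():
    p.requires_grad_(False)
student = EmbeddingNet()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(32, 512)
labels = torch.randint(0, 10, (32,))
print(distillation_step(student, teacher, x, labels, opt))
```

Because the teacher is trained offline and never updated, the student only pays a forward-pass cost for the guidance term; the balance between the metric loss and the guidance term (`alpha` above) would control how strongly the teacher's hypothesis space constrains the student.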
