Paper Title

Distilling Inter-Class Distance for Semantic Segmentation

Authors

Zhengbo Zhang, Chunluan Zhou, Zhigang Tu

Abstract

Knowledge distillation is widely adopted in semantic segmentation to reduce the computation cost. Previous knowledge distillation methods for semantic segmentation focus on pixel-wise feature alignment and intra-class feature variation distillation, neglecting to transfer the knowledge of the inter-class distance in the feature space, which is important for semantic segmentation. To address this issue, we propose an Inter-class Distance Distillation (IDD) method to transfer the inter-class distance in the feature space from the teacher network to the student network. Furthermore, since semantic segmentation is a position-dependent task, we exploit a position information distillation module to help the student network encode more position information. Extensive experiments on three popular datasets, Cityscapes, Pascal VOC and ADE20K, show that our method helps improve the accuracy of semantic segmentation models and achieves state-of-the-art performance. For example, it boosts the benchmark model ("PSPNet+ResNet18") by 7.50% in accuracy on the Cityscapes dataset.
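
To make the idea concrete, below is a minimal PyTorch sketch of what inter-class distance distillation could look like, based only on the abstract: per-class prototype features are averaged from each network's feature map, and the student's pairwise inter-class distance matrix is pulled toward the teacher's. The function names (`class_prototypes`, `inter_class_distance_loss`), the Euclidean metric, and the MSE matching loss are illustrative assumptions; the paper's exact formulation is not given here.

```python
# Illustrative sketch of inter-class distance distillation (IDD), assuming
# Euclidean prototype distances and an MSE matching loss; the paper's actual
# objective may differ.
import torch
import torch.nn.functional as F


def class_prototypes(features, labels, num_classes, ignore_index=255):
    """Mean feature vector per class over all labeled pixels.

    features: (B, C, H, W) feature map from one network.
    labels:   (B, H, W) ground-truth map at the same spatial resolution.
    Returns a (num_classes, C) prototype matrix; absent classes stay zero.
    """
    b, c, h, w = features.shape
    feats = features.permute(0, 2, 3, 1).reshape(-1, c)  # (B*H*W, C)
    labs = labels.reshape(-1)                            # (B*H*W,)
    valid = labs != ignore_index                         # drop ignored pixels
    feats, labs = feats[valid], labs[valid]
    protos = torch.zeros(num_classes, c, device=features.device)
    for k in range(num_classes):
        mask = labs == k
        if mask.any():
            protos[k] = feats[mask].mean(dim=0)
    return protos


def inter_class_distance_loss(f_student, f_teacher, labels, num_classes):
    """Match the student's pairwise inter-class distance matrix to the teacher's.

    Prototypes are computed per network, so the student and teacher feature
    dimensions may differ: both distance matrices are (K, K) regardless.
    """
    p_s = class_prototypes(f_student, labels, num_classes)
    p_t = class_prototypes(f_teacher, labels, num_classes)
    d_s = torch.cdist(p_s, p_s)  # (K, K) student inter-class distances
    d_t = torch.cdist(p_t, p_t)  # (K, K) teacher inter-class distances
    return F.mse_loss(d_s, d_t.detach())
```

In practice the ground-truth labels would first be downsampled to the feature resolution with nearest-neighbor interpolation, e.g. `F.interpolate(labels[:, None].float(), size=f_student.shape[-2:], mode="nearest").squeeze(1).long()`, and this loss would be added to the usual segmentation and distillation losses.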
