Title
UCC: Uncertainty guided Cross-head Co-training for Semi-Supervised Semantic Segmentation
Authors
Abstract
Deep neural networks (DNNs) have achieved great success in semantic segmentation, which requires a large amount of labeled data for training. We present a novel learning framework called Uncertainty guided Cross-head Co-training (UCC) for semi-supervised semantic segmentation. Our framework introduces weak and strong augmentations within a shared encoder to achieve co-training, which naturally combines the benefits of consistency regularization and self-training. Each segmentation head interacts with its peers, and the prediction on the weakly augmented view is used to supervise the strongly augmented one. The diversity of consistency training samples can be boosted by Dynamic Cross-Set Copy-Paste (DCSCP), which also alleviates the distribution mismatch and class imbalance problems. Moreover, our proposed Uncertainty Guided Re-weight Module (UGRM) enhances self-training by modeling uncertainty to suppress the effect of low-quality pseudo labels coming from a head's peer. Extensive experiments on Cityscapes and PASCAL VOC 2012 demonstrate the effectiveness of our UCC. Our approach significantly outperforms other state-of-the-art semi-supervised semantic segmentation methods, achieving 77.17$\%$ and 76.49$\%$ mIoU on the Cityscapes and PASCAL VOC 2012 datasets respectively under the 1/16 protocol, which is +10.1$\%$ and +7.91$\%$ better than the supervised baseline.
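The abstract does not spell out implementation details, so the following is only a minimal PyTorch-style sketch, under our own assumptions, of the core idea it describes: two decoder heads share one encoder, each head's prediction on the weakly augmented view pseudo-labels the peer head's prediction on the strongly augmented view, and per-pixel losses are down-weighted by an entropy-based uncertainty estimate (our stand-in for UGRM). The names `encoder`, `head_a`, `head_b`, and the exact weighting scheme are hypothetical, not the authors' released code.

```python
# Hypothetical sketch of cross-head co-training with uncertainty-guided
# re-weighting. The entropy-based weight is an assumption standing in for
# the paper's UGRM; DCSCP augmentation is omitted for brevity.
import torch
import torch.nn.functional as F

def uncertainty_weight(logits, eps=1e-8):
    """Per-pixel confidence weight from predictive entropy, normalized to
    [0, 1]; 1 means a confident (low-entropy) pseudo label."""
    probs = torch.softmax(logits, dim=1)                       # [B, C, H, W]
    entropy = -(probs * torch.log(probs + eps)).sum(dim=1)     # [B, H, W]
    max_entropy = torch.log(torch.tensor(float(probs.shape[1])))
    return 1.0 - entropy / max_entropy

def cross_head_loss(encoder, head_a, head_b, x_weak, x_strong):
    """Each head's weak-view prediction supervises the peer head's
    strong-view prediction, weighted by the teacher's certainty."""
    with torch.no_grad():  # pseudo labels come from the weak view
        feat_w = encoder(x_weak)
        logits_a_w, logits_b_w = head_a(feat_w), head_b(feat_w)
        pseudo_a = logits_a_w.argmax(dim=1)                    # [B, H, W]
        pseudo_b = logits_b_w.argmax(dim=1)
        w_a = uncertainty_weight(logits_a_w)
        w_b = uncertainty_weight(logits_b_w)

    feat_s = encoder(x_strong)
    logits_a_s, logits_b_s = head_a(feat_s), head_b(feat_s)

    # Cross supervision: head A learns from head B's pseudo labels and
    # vice versa, with unreliable pixels suppressed by the peer's weight.
    loss_a = (w_b * F.cross_entropy(logits_a_s, pseudo_b, reduction="none")).mean()
    loss_b = (w_a * F.cross_entropy(logits_b_s, pseudo_a, reduction="none")).mean()
    return loss_a + loss_b
```

In this reading, the shared encoder sees both views while the two heads act as mutual teachers, which is what lets the framework combine consistency regularization (weak-to-strong agreement) with self-training (hard pseudo labels) in a single loss.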