Paper Title
Learnable Polyphase Sampling for Shift Invariant and Equivariant Convolutional Networks
Paper Authors
Paper Abstract
We propose learnable polyphase sampling (LPS), a pair of learnable down/upsampling layers that enable truly shift-invariant and equivariant convolutional networks. LPS can be trained end-to-end from data and generalizes existing handcrafted downsampling layers. It is widely applicable as it can be integrated into any convolutional network by replacing down/upsampling layers. We evaluate LPS on image classification and semantic segmentation. Experiments show that LPS is on-par with or outperforms existing methods in both performance and shift consistency. For the first time, we achieve true shift-equivariance on semantic segmentation (PASCAL VOC), i.e., 100% shift consistency, outperforming baselines by an absolute 3.3%.
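To make the idea concrete, below is a minimal NumPy sketch of the kind of handcrafted polyphase downsampling that LPS generalizes: the input is split into its stride² polyphase components (interleaved sub-grids), and a fixed rule keeps the component with the largest norm. This is an illustration under stated assumptions, not the paper's implementation; `polyphase_components` and `polyphase_downsample` are hypothetical names, and LPS replaces the fixed max-norm rule with a selection learned end-to-end from data.

```python
import numpy as np

def polyphase_components(x, stride=2):
    """Split a 2-D feature map into its stride**2 polyphase
    components, i.e. the interleaved sub-grids x[i::stride, j::stride]."""
    return [x[i::stride, j::stride]
            for i in range(stride) for j in range(stride)]

def polyphase_downsample(x, stride=2):
    """Handcrafted selection rule: keep the polyphase component with
    the largest l2 norm. (LPS instead learns this selection from data.)"""
    comps = polyphase_components(x, stride)
    norms = [np.linalg.norm(c) for c in comps]
    return comps[int(np.argmax(norms))]

# A circular shift of the input permutes the polyphase components
# (and cyclically shifts them internally), so the selected component
# carries the same values -- the basis of shift invariance.
x = np.arange(16.0).reshape(4, 4)
y = polyphase_downsample(x)                              # max-norm component
y_shift = polyphase_downsample(np.roll(x, 1, axis=1))    # shifted input
assert sorted(y.ravel()) == sorted(y_shift.ravel())
```

Because norms are preserved under a circular shift, the same component (up to an internal shift) wins the selection before and after shifting, which is why downstream pooling can yield fully shift-invariant predictions.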