Paper Title

DenseMTL: Cross-task Attention Mechanism for Dense Multi-task Learning

Paper Authors

Ivan Lopes, Tuan-Hung Vu, Raoul de Charette

Paper Abstract


Multi-task learning has recently emerged as a promising solution for a comprehensive understanding of complex scenes. In addition to being memory-efficient, multi-task models, when appropriately designed, can facilitate the exchange of complementary signals across tasks. In this work, we jointly address 2D semantic segmentation and three geometry-related tasks: dense depth estimation, surface normal estimation, and edge estimation, demonstrating their benefits on both indoor and outdoor datasets. We propose a novel multi-task learning architecture that leverages pairwise cross-task exchange through correlation-guided attention and self-attention to enhance the overall representation learning for all tasks. We conduct extensive experiments across three multi-task setups, showing the advantages of our approach compared to competitive baselines in both synthetic and real-world benchmarks. Additionally, we extend our method to the novel multi-task unsupervised domain adaptation setting. Our code is available at https://github.com/cv-rits/DenseMTL.
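To make the "pairwise cross-task exchange" idea concrete, below is a minimal sketch of generic scaled dot-product cross-attention between the feature maps of two tasks, where one task's features act as queries over the other task's features. This is an illustrative simplification, not the paper's actual DenseMTL module (which combines correlation-guided attention with self-attention); the function name `cross_task_attention` and the toy shapes are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_task_attention(feat_a, feat_b):
    """Attend from task-A features (queries) over task-B features
    (keys/values), so task A can pull in complementary signal from B.

    feat_a, feat_b: (N, C) arrays of N spatial locations, C channels.
    Returns an (N, C) array of exchanged features for task A.
    """
    c = feat_a.shape[1]
    # (N, N) pairwise correlation between the two tasks' locations,
    # scaled as in standard dot-product attention.
    scores = feat_a @ feat_b.T / np.sqrt(c)
    attn = softmax(scores, axis=-1)  # rows sum to 1
    return attn @ feat_b

# Toy example: 16 spatial locations, 8 channels per task.
rng = np.random.default_rng(0)
fa = rng.standard_normal((16, 8))  # e.g. segmentation features
fb = rng.standard_normal((16, 8))  # e.g. depth features
out = cross_task_attention(fa, fb)
print(out.shape)  # → (16, 8)
```

In the actual architecture such an exchange is computed for every ordered task pair, and the exchanged features are fused back into each task's decoder branch.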
