Paper Title
Adaptive Cross-Layer Attention for Image Restoration
Paper Authors
Paper Abstract
Non-local attention modules have been proven to be crucial for image restoration. Conventional non-local attention processes the features of each layer separately, so it risks missing correlations between features at different layers. To address this problem, we aim to design attention modules that aggregate information from different layers. Instead of finding correlated key pixels within the same layer, each query pixel is encouraged to attend to key pixels at multiple previous layers of the network. To efficiently embed such an attention design into neural network backbones, we propose a novel Adaptive Cross-Layer Attention (ACLA) module. Two adaptive designs are proposed for ACLA: (1) adaptively selecting the keys for non-local attention at each layer; (2) automatically searching for the insertion locations of ACLA modules. With these two adaptive designs, ACLA dynamically selects a flexible number of keys to be aggregated for non-local attention at previous layers while maintaining a compact neural network with compelling performance. Extensive experiments on image restoration tasks, including single image super-resolution, image denoising, image demosaicing, and image compression artifacts reduction, validate the effectiveness and efficiency of ACLA. The code of ACLA is available at \url{https://github.com/SDL-ASU/ACLA}.
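The sketch below illustrates the cross-layer attention idea described in the abstract: queries come from the current layer's feature map, while keys and values are gathered from feature maps of previous layers, so each query pixel can attend across layers. This is a minimal PyTorch illustration under the assumption that all feature maps share the same channel count; the class name `CrossLayerAttention` and its parameters are illustrative, and the adaptive key selection and searched insertion locations of the actual ACLA module are omitted.

```python
import torch
import torch.nn as nn


class CrossLayerAttention(nn.Module):
    """Minimal cross-layer non-local attention sketch (not the full ACLA module).

    Queries are projected from the current layer's features; keys and values
    are projected from the current and previous layers' features and
    concatenated along the token dimension, so one softmax attends across layers.
    """

    def __init__(self, channels: int, embed_dim: int = 64):
        super().__init__()
        self.to_q = nn.Conv2d(channels, embed_dim, kernel_size=1)
        self.to_k = nn.Conv2d(channels, embed_dim, kernel_size=1)
        self.to_v = nn.Conv2d(channels, channels, kernel_size=1)
        self.scale = embed_dim ** -0.5

    def forward(self, x: torch.Tensor, prev_feats: list) -> torch.Tensor:
        # x: (B, C, H, W); prev_feats: list of (B, C, H_i, W_i) tensors
        b, c, h, w = x.shape
        q = self.to_q(x).flatten(2).transpose(1, 2)                         # (B, HW, E)

        sources = [x] + list(prev_feats)
        k = torch.cat([self.to_k(f).flatten(2) for f in sources], dim=2)    # (B, E, N)
        v = torch.cat([self.to_v(f).flatten(2) for f in sources], dim=2)    # (B, C, N)

        attn = torch.softmax((q @ k) * self.scale, dim=-1)                  # (B, HW, N)
        out = (attn @ v.transpose(1, 2)).transpose(1, 2).reshape(b, c, h, w)
        return x + out                                                      # residual connection


# Usage sketch: attend to two earlier feature maps with matching channel count.
if __name__ == "__main__":
    x = torch.randn(1, 32, 16, 16)
    prev = [torch.randn(1, 32, 16, 16), torch.randn(1, 32, 16, 16)]
    y = CrossLayerAttention(channels=32)(x, prev)
    print(y.shape)  # torch.Size([1, 32, 16, 16])
```

Concatenating previous-layer features along the token dimension lets a single attention map weigh keys from all layers jointly, which is the behavior the abstract contrasts with conventional per-layer non-local attention.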