Paper Title

Real Image Restoration via Structure-preserving Complementarity Attention

Paper Authors

Yuanfan Zhang, Gen Li, Lei Sun

Paper Abstract

Since convolutional neural networks perform well at learning generalizable image priors from large-scale data, these models have been widely used in image denoising tasks. However, their computational complexity also increases dramatically as models grow more complex. In this paper, we propose a novel lightweight Complementary Attention Module, consisting of a dense module and a sparse module that cooperatively mine dense and sparse features for complementary feature learning, yielding an efficient lightweight architecture. Moreover, to reduce the loss of detail caused by denoising, we construct a gradient-based structure-preserving branch. This branch provides additional structural priors for denoising, and gradient-loss optimization makes the model pay more attention to geometric image details. Based on the above, we propose an efficient dual-branch U-Net-structured network, SCANet. Visual results show that it effectively preserves the structural details of the original image. We evaluate it on benchmarks including SIDD and DND, where SCANet achieves state-of-the-art performance in PSNR and SSIM while significantly reducing computational cost.
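The abstract's gradient-loss idea can be illustrated with a minimal sketch: penalize the difference between the gradient maps of the restored image and the ground truth, so edges and geometric detail are preserved. This is not the paper's implementation; the finite-difference gradient operator and the L1 penalty below are assumptions for illustration only.

```python
import numpy as np

def image_gradients(img):
    """Finite-difference gradients along height and width
    (an illustrative stand-in for the paper's gradient extraction)."""
    gy = np.abs(np.diff(img, axis=0))  # vertical differences
    gx = np.abs(np.diff(img, axis=1))  # horizontal differences
    return gy, gx

def gradient_loss(pred, target):
    """L1 distance between gradient maps of the denoised output and the
    clean target, encouraging preservation of structural detail."""
    py, px = image_gradients(pred)
    ty, tx = image_gradients(target)
    return np.mean(np.abs(py - ty)) + np.mean(np.abs(px - tx))

# Toy usage: identical images incur zero gradient loss.
a = np.random.rand(8, 8)
print(gradient_loss(a, a))  # → 0.0
```

In practice such a term would be added to a pixel-wise reconstruction loss with a weighting coefficient, steering the optimizer toward edge-consistent solutions.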
