Paper Title


Dual-Attention GAN for Large-Pose Face Frontalization

Authors

Yu Yin, Songyao Jiang, Joseph P. Robinson, Yun Fu

Abstract


Face frontalization provides an effective and efficient way to augment face data and further improves face recognition performance in extreme pose scenarios. Despite recent advances in deep learning-based face synthesis approaches, the problem remains challenging due to significant pose and illumination discrepancies. In this paper, we present a novel Dual-Attention Generative Adversarial Network (DA-GAN) for photo-realistic face frontalization that captures both contextual dependencies and local consistency during GAN training. Specifically, a self-attention-based generator is introduced to integrate local features with their long-range dependencies, yielding better feature representations and hence generating faces that better preserve identity, especially at larger pose angles. Moreover, a novel face-attention-based discriminator is applied to emphasize local features of face regions and thus reinforce the realism of the synthesized frontal faces. Guided by semantic segmentation, four independent discriminators are used to distinguish between different aspects of a face (i.e., skin, keypoints, hairline, and the frontalized face). By introducing these two complementary attention mechanisms into the generator and the discriminator separately, we learn richer feature representations and generate identity-preserving inferences of frontal views with much finer details (i.e., more accurate facial appearance and textures) compared to the state of the art. Quantitative and qualitative experimental results demonstrate the effectiveness and efficiency of our DA-GAN approach.
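The self-attention generator described above integrates each local feature with a weighted sum over all spatial positions, so every output location carries long-range context. Below is a minimal NumPy sketch of one such non-local attention step; the function name, weight shapes, and the fixed residual weight `gamma` are illustrative assumptions, not details taken from the paper (in practice `gamma` and the projections would be learned).

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv, gamma=0.1):
    """One self-attention step over flattened spatial features.

    x: (N, C) array of N spatial positions with C channels.
    Each output position becomes a softmax-weighted mix of all
    positions, blending local features with long-range context.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv              # query/key/value projections
    scores = q @ k.T / np.sqrt(k.shape[1])        # (N, N) pairwise affinities
    scores -= scores.max(axis=1, keepdims=True)   # for numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over all positions
    context = attn @ v                            # aggregated long-range context
    return x + gamma * context                    # residual blend of local + global

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))                  # 16 positions, 8 channels
W = [rng.standard_normal((8, 8)) * 0.1 for _ in range(3)]
y = self_attention(x, *W)
print(y.shape)  # (16, 8): same shape as the input, now context-aware
```

The residual form means the block can fall back to purely local features when `gamma` is small, which is why such layers are typically initialized with `gamma` near zero and learned during training.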
