Paper Title

FAN-Trans: Online Knowledge Distillation for Facial Action Unit Detection

Paper Authors

Jing Yang, Jie Shen, Yiming Lin, Yordan Hristov, Maja Pantic

Paper Abstract

Due to its importance in facial behaviour analysis, facial action unit (AU) detection has attracted increasing attention from the research community. Leveraging the online knowledge distillation framework, we propose the "FAN-Trans" method for AU detection. Our model consists of a hybrid network of convolution and transformer blocks to learn per-AU features and to model AU co-occurrences. The model uses a pre-trained face alignment network as the feature extractor. After further transformation by a small learnable add-on convolutional subnet, the per-AU features are fed into transformer blocks to enhance their representation. As multiple AUs often appear together, we propose a learnable attention-drop mechanism in the transformer block to learn the correlation between the features for different AUs. We also design a classifier that predicts AU presence by considering all AUs' features, to explicitly capture label dependencies. Finally, we adapt online knowledge distillation in the training stage for this task, further improving the model's performance. Experiments on the BP4D and DISFA datasets demonstrate the effectiveness of the proposed method.
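
The abstract describes a pipeline of a pre-trained face-alignment backbone, a small add-on convolutional subnet producing per-AU features, transformer blocks over the per-AU tokens, and a classifier that looks at all AU features jointly. The PyTorch snippet below is a minimal sketch of that overall structure only; the module sizes, pooling choices, and all names are illustrative assumptions, and the paper's attention-drop mechanism and online knowledge-distillation training are not reproduced here.

```python
# Minimal sketch of a hybrid conv + transformer AU-detection pipeline, in the
# spirit of the abstract. Sizes and module choices are illustrative assumptions,
# not the authors' implementation; the attention-drop mechanism and the online
# knowledge-distillation loss are omitted.
import torch
import torch.nn as nn

class AUDetectionSketch(nn.Module):
    def __init__(self, num_aus=12, feat_channels=256, embed_dim=128, num_layers=2):
        super().__init__()
        # Stand-in for the pre-trained face alignment network used as feature extractor.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_channels, kernel_size=7, stride=4, padding=3),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(8),
        )
        # Small learnable add-on convolutional subnet producing per-AU features.
        self.au_conv = nn.Conv2d(feat_channels, num_aus * embed_dim, kernel_size=1)
        # Transformer blocks let the per-AU tokens attend to each other,
        # modelling AU co-occurrence.
        encoder_layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4,
                                                   batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # Classifier that sees all AU tokens jointly to capture label dependencies.
        self.classifier = nn.Linear(num_aus * embed_dim, num_aus)
        self.num_aus = num_aus
        self.embed_dim = embed_dim

    def forward(self, x):
        b = x.size(0)
        feat = self.backbone(x)                        # (B, C, H, W)
        au_feat = self.au_conv(feat).mean(dim=(2, 3))  # pool to (B, num_aus * embed_dim)
        tokens = au_feat.view(b, self.num_aus, self.embed_dim)
        tokens = self.transformer(tokens)              # per-AU tokens exchange information
        logits = self.classifier(tokens.flatten(1))    # joint prediction over all AUs
        return torch.sigmoid(logits)                   # AU presence probabilities

model = AUDetectionSketch()
probs = model(torch.randn(2, 3, 256, 256))  # shape (2, num_aus)
```

In the actual method, training would additionally involve the online knowledge-distillation objective between peer branches described in the abstract; the sketch above only covers the forward architecture.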
