Paper Title
Self-Attention Neural Bag-of-Features
Paper Authors
Paper Abstract
In this work, we propose several attention formulations for multivariate sequence data. We build on top of the recently introduced 2D-Attention and reformulate the attention learning methodology by quantifying the relevance of the feature/temporal dimensions through latent spaces based on self-attention, rather than learning them directly. In addition, we propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information without treating the feature and temporal representations independently. The proposed approaches can be used in various architectures, and we specifically evaluate their application together with the Neural Bag-of-Features feature extraction module. Experiments on several sequence data analysis tasks show that our approaches yield improved performance compared to standard methods.
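To make the core idea concrete, the sketch below illustrates what a joint 2D feature-temporal attention mask looks like: a single softmax-normalized mask over the full time-by-feature grid weights every entry of a multivariate sequence jointly, instead of applying separate temporal and feature attentions. This is a minimal illustrative sketch, not the paper's implementation; the function name, the score matrix `W` being passed in explicitly (rather than produced by a learned self-attention module), and the rescaling by `X.size` are all assumptions for demonstration.

```python
import numpy as np

def joint_2d_attention(X, W):
    """Apply a joint 2D attention mask to a multivariate sequence.

    X: (T, D) array, T time steps with D features per step.
    W: (T, D) score matrix (in the paper's setting this would come
       from a learned self-attention module; here it is an input).
    Returns (attended_sequence, mask).
    """
    # Softmax over all T*D scores jointly, so temporal and feature
    # relevance are normalized together, not independently.
    scores = W - W.max()            # subtract max for numerical stability
    mask = np.exp(scores)
    mask /= mask.sum()              # mask entries sum to 1 over the grid
    # Rescale by the number of entries so a uniform mask would leave
    # the input unchanged (an illustrative choice, not from the paper).
    attended = X * mask * X.size
    return attended, mask

# Toy usage: 8 time steps, 4 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))
W = rng.normal(size=(8, 4))
attended, mask = joint_2d_attention(X, W)
```

Because the softmax runs over the flattened time-by-feature grid, a single salient (time step, feature) cell can dominate the mask, which is precisely the kind of cross-dimension interaction that separate temporal and feature attentions cannot express.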