Paper Title

Adaptive Structural Similarity Preserving for Unsupervised Cross Modal Hashing

Paper Authors

Liang Li, Baihua Zheng, Weiwei Sun

Paper Abstract

Cross-modal hashing is an important approach for multimodal data management and application. Existing unsupervised cross-modal hashing algorithms mainly rely on data features from pre-trained models to mine their similarity relationships. However, their optimization objectives are based on static metrics between the original uni-modal features, without further exploring data correlations during training. In addition, most of them focus on association mining and alignment among pairwise instances in continuous space but ignore the latent structural correlations contained in the semantic hashing space. In this paper, we propose an unsupervised hash learning framework, namely Adaptive Structural Similarity Preservation Hashing (ASSPH), to solve the above problems. First, we propose an adaptive learning scheme that works with limited data and training batches to enrich the semantic correlations of unlabeled instances during training while ensuring smooth convergence of the training process. Second, we present an asymmetric structural semantic representation learning scheme: we introduce structural semantic metrics based on graph adjacency relations during the semantic reconstruction and correlation mining stage, and align the structural semantics in the hash space through an asymmetric binary optimization process. Finally, we conduct extensive experiments to validate the improvements of our work over existing methods.
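The abstract names two technical ingredients: a structural semantic metric built from graph adjacency relations over pre-trained features, and an asymmetric binary optimization that aligns continuous network outputs with fixed binary codes. The snippet below is a minimal NumPy sketch of how such components could be wired together; the function names, the Jaccard form of the neighbour overlap, and the squared-error loss are illustrative assumptions for exposition, not the paper's actual ASSPH implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_adjacency(features, k=10):
    """Binary kNN adjacency built from cosine similarity of pre-trained features."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, -np.inf)           # exclude self-matches
    nbrs = np.argsort(-sim, axis=1)[:, :k]   # k nearest neighbours per row
    adj = np.zeros(sim.shape, dtype=np.float32)
    np.put_along_axis(adj, nbrs, 1.0, axis=1)
    return adj

def structural_similarity(adj, k):
    """Jaccard overlap of neighbour sets (an assumed form of the metric).

    Two instances count as structurally similar when the kNN graph places
    them next to the same points, even if their raw feature distance is large.
    """
    shared = adj @ adj.T      # |N(i) ∩ N(j)|
    union = 2 * k - shared    # |N(i) ∪ N(j)| for fixed-size neighbour sets
    return shared / union

def asymmetric_alignment_loss(U, B, S, c):
    """|| (U B^T)/c - S' ||^2 with S' = 2S - 1 rescaled to [-1, 1].

    U holds relaxed (continuous) codes from the network; B holds fixed
    binary codes; c is the code length in bits.
    """
    return np.mean((U @ B.T / c - (2 * S - 1)) ** 2)

# Toy run: 100 instances, 64-d features, 16-bit codes.
X = rng.standard_normal((100, 64))
adj = knn_adjacency(X, k=10)
S = structural_similarity(adj, k=10)
U = np.tanh(rng.standard_normal((100, 16)))  # relaxed network outputs in (-1, 1)
B = np.sign(U)                               # binarized codes, held fixed
print("alignment loss:", asymmetric_alignment_loss(U, B, S, 16))
```

Keeping B binary on one side of the inner product while U stays continuous is what makes such an optimization asymmetric: gradients can flow through the relaxed codes U, while the codes B themselves never leave the Hamming space.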
