Paper Title
Decoupled Adversarial Contrastive Learning for Self-supervised Adversarial Robustness
Paper Authors
Paper Abstract
Adversarial training (AT) for robust representation learning and self-supervised learning (SSL) for unsupervised representation learning are two active research fields. By integrating AT into SSL, multiple prior works have tackled a significant yet challenging task: learning robust representations without labels. A widely used framework is adversarial contrastive learning, which couples AT and SSL and thus constitutes a very complex optimization problem. Inspired by the divide-and-conquer philosophy, we conjecture that it might be simplified as well as improved by solving two sub-problems: non-robust SSL and pseudo-supervised AT. This motivation shifts the focus of the task from seeking an optimal integration strategy for a coupled problem to finding sub-solutions for sub-problems. With that said, this work discards the prior practice of directly introducing AT into SSL frameworks and proposes a two-stage framework termed Decoupled Adversarial Contrastive Learning (DeACL). Extensive experimental results demonstrate that our DeACL achieves SOTA self-supervised adversarial robustness while significantly reducing the training time, which validates its effectiveness and efficiency. Moreover, our DeACL constitutes a more explainable solution, and its success also bridges the gap with semi-supervised AT for exploiting unlabeled samples for robust representation learning. The code is publicly accessible at https://github.com/pantheon5100/DeACL.
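The abstract describes a two-stage recipe: standard (non-robust) SSL pretraining of a teacher encoder, followed by pseudo-supervised adversarial training of a student that uses the frozen teacher's features as pseudo-targets. Below is a minimal PyTorch sketch of what such a stage-2 training step could look like, assuming a SimCLR-style stage 1, a cosine-similarity distillation loss, and an L-inf PGD attack; all names here (pgd_attack, stage2_step, teacher, student) and the exact loss form are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the two-stage idea described in the abstract.
# Stage 1: standard (non-robust) self-supervised pretraining of a teacher encoder.
# Stage 2: pseudo-supervised AT of a student encoder, distilled from the frozen teacher.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18


def pgd_attack(student, teacher, x, eps=8 / 255, alpha=2 / 255, steps=5):
    """Craft L-inf perturbations that push the student's feature of x away
    from the frozen teacher's feature of the clean x (assumed attack objective)."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    with torch.no_grad():
        target = teacher(x)                      # pseudo-target: clean teacher feature
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = -F.cosine_similarity(student(x_adv), target, dim=-1).mean()
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()


def stage2_step(student, teacher, x, optimizer):
    """One pseudo-supervised AT step: align the student's clean and adversarial
    features with the frozen teacher's clean features (assumed loss form)."""
    x_adv = pgd_attack(student, teacher, x)
    with torch.no_grad():
        target = teacher(x)
    loss = (-F.cosine_similarity(student(x_adv), target, dim=-1).mean()
            - F.cosine_similarity(student(x), target, dim=-1).mean())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Usage sketch; stage 1 is any off-the-shelf SSL method, so the teacher is
# assumed to be an already SSL-pretrained encoder that we freeze here.
teacher = resnet18(num_classes=128).eval()
for p in teacher.parameters():
    p.requires_grad_(False)
student = resnet18(num_classes=128)
optimizer = torch.optim.SGD(student.parameters(), lr=0.1, momentum=0.9)

x = torch.rand(4, 3, 32, 32)                     # dummy CIFAR-sized batch
print(stage2_step(student, teacher, x, optimizer))
```

The point this sketch tries to illustrate is the decoupling: both the attack and the training loss only need the frozen teacher's clean features as pseudo-targets, so stage 2 reduces to an ordinary (pseudo-)supervised AT loop rather than a coupled SSL-plus-AT objective.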