Paper Title
A Sequential Framework Towards an Exact SDP Verification of Neural Networks
Paper Authors
Paper Abstract
Although neural networks have been applied to several systems in recent years, they still cannot be used in safety-critical systems due to the lack of efficient techniques to certify their robustness. A number of techniques based on convex optimization have been proposed in the literature to study the robustness of neural networks, and the semidefinite programming (SDP) approach has emerged as a leading contender for the robust certification of neural networks. The major challenge for the SDP approach is that it is prone to a large relaxation gap. In this work, we address this issue by developing a sequential framework that shrinks this gap to zero by adding non-convex cuts to the optimization problem via disjunctive programming. We analyze the performance of this sequential SDP method both theoretically and empirically, and show that it bridges the gap as the number of cuts increases.
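For context, the following is a minimal sketch of the kind of SDP relaxation the abstract refers to, using the standard lifted formulation for a single ReLU layer; this is common background, not the paper's exact formulation or its disjunctive cuts. A ReLU activation z = \max(0, Wx + b) is equivalent to the quadratic constraints

    z \ge Wx + b, \qquad z \ge 0, \qquad z \odot (z - Wx - b) = 0,

since each coordinate must equal either 0 or (Wx + b)_i while dominating both. Collecting v = (1, x^\top, z^\top)^\top and introducing the lifted variable P = v v^\top, all of these constraints become linear in the entries of P, and the only remaining non-convexity is the rank-one condition P = v v^\top. The SDP relaxation keeps P \succeq 0 with P_{11} = 1 and drops the rank-one condition; the relaxation gap mentioned in the abstract is the slack created by dropping that condition, which the sequentially added non-convex cuts are designed to close.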