Paper Title


Understanding Deep Learning via Decision Boundary

Authors

Shiye Lei, Fengxiang He, Yancheng Yuan, Dacheng Tao

Abstract

This paper discovers that neural networks with lower decision boundary (DB) variability have better generalizability. Two new notions, algorithm DB variability and $(\epsilon, \eta)$-data DB variability, are proposed to measure decision boundary variability from the algorithm and data perspectives. Extensive experiments show significant negative correlations between decision boundary variability and generalizability. From the theoretical view, two lower bounds based on algorithm DB variability are proposed and do not explicitly depend on the sample size. We also prove an upper bound of order $\mathcal{O}\left(\frac{1}{\sqrt{m}} + \epsilon + \eta\log\frac{1}{\eta}\right)$ based on data DB variability. The bound is convenient to estimate without the requirement of labels, and does not explicitly depend on the network size, which is usually prohibitively large in deep learning.
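As a rough, label-free illustration of the kind of quantity the abstract describes (not the paper's exact definition of DB variability), the sketch below treats decision boundary variability as the average pairwise disagreement between the class predictions of several independently trained networks on the same unlabeled inputs. The function name `pairwise_disagreement` and the disagreement proxy are assumptions made for illustration only.

```python
import numpy as np

def pairwise_disagreement(predictions: np.ndarray) -> float:
    """Illustrative proxy for decision boundary variability (assumed, not the
    paper's exact measure): the average fraction of unlabeled inputs on which
    two independently trained networks predict different classes.

    predictions: array of shape (n_models, n_samples) with each model's
    predicted class index on the same unlabeled evaluation set.
    """
    n_models = predictions.shape[0]
    disagreements = []
    for i in range(n_models):
        for j in range(i + 1, n_models):
            disagreements.append(np.mean(predictions[i] != predictions[j]))
    return float(np.mean(disagreements))

# Example: class predictions of 3 training runs (different random seeds) on 6 inputs.
preds = np.array([
    [0, 1, 1, 2, 0, 1],
    [0, 1, 1, 2, 1, 1],
    [0, 1, 2, 2, 0, 1],
])
print(pairwise_disagreement(preds))  # lower value -> more stable decision boundary
```

Note that the estimate uses only model predictions, not ground-truth labels, which mirrors the abstract's claim that the data DB variability bound can be estimated without labels.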
