Paper Title


Stable Sample Compression Schemes: New Applications and an Optimal SVM Margin Bound

Paper Authors

Steve Hanneke, Aryeh Kontorovich

Paper Abstract


We analyze a family of supervised learning algorithms based on sample compression schemes that are stable, in the sense that removing points from the training set which were not selected for the compression set does not alter the resulting classifier. We use this technique to derive a variety of novel or improved data-dependent generalization bounds for several learning algorithms. In particular, we prove a new margin bound for SVM, removing a log factor. The new bound is provably optimal. This resolves a long-standing open question about the PAC margin bounds achievable by SVM.
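The stability property described in the abstract can be illustrated with the paper's central example: a hard-margin SVM, whose support vectors play the role of the compression set. The sketch below (not from the paper; it uses scikit-learn's `SVC` and synthetic linearly separable data of my own construction) retrains the classifier on the support vectors alone and checks that its predictions are unchanged.

```python
# Illustrative sketch of SVM as a stable sample compression scheme.
# Assumptions (not from the paper): scikit-learn's SVC as the SVM solver,
# a large C to approximate a hard margin, and synthetic separable data.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
w = np.array([1.0, 1.0]) / np.sqrt(2)  # true separating direction

# Draw 2-D points and push each class away from the boundary w.x = 0
# so the data is linearly separable with margin at least 0.5.
X = rng.normal(size=(200, 2))
y = (X @ w > 0).astype(int)
X += 0.5 * np.outer(2 * y - 1, w)

# Train on the full sample; C=1e6 approximates a hard-margin SVM.
clf_full = SVC(kernel="linear", C=1e6).fit(X, y)
sv = clf_full.support_  # indices of the support vectors (compression set)

# Retrain on the compression set alone.
clf_sv = SVC(kernel="linear", C=1e6).fit(X[sv], y[sv])

# Stability: removing the non-support-vector points does not alter
# the resulting classifier's predictions.
agree = bool((clf_full.predict(X) == clf_sv.predict(X)).all())
compressed = len(sv) < len(X)
```

For a hard-margin SVM the solution is determined by the support vectors alone, so `clf_sv` reproduces `clf_full`; this is the stability condition under which the paper's improved, log-factor-free margin bound applies.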
