Paper Title

A Case of Exponential Convergence Rates for SVM

Paper Authors

Vivien Cabannes, Stefano Vigogna

Paper Abstract

Classification is often the first problem described in introductory machine learning classes. Generalization guarantees for classification have historically been provided by Vapnik-Chervonenkis theory. Yet those guarantees rely on intractable algorithms, which has led to the theory of surrogate methods in classification. The guarantees offered by surrogate methods are based on calibration inequalities, which have been shown to be highly suboptimal under some margin conditions, falling short of capturing exponential convergence phenomena. These "super-fast" rates are becoming well understood for smooth surrogates, but the picture remains blurry for non-smooth losses such as the hinge loss associated with the renowned support vector machine. In this paper, we present a simple mechanism to obtain fast convergence rates, and we investigate its usage for SVMs. In particular, we show that SVMs can exhibit exponential convergence rates even without assuming the hard Tsybakov margin condition.
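For reference, a minimal statement of the two objects the abstract names; these are the standard textbook definitions, not excerpts from the paper itself:

% Hinge loss of the SVM, for a label y in {-1, +1} and a score f(x):
\[
  \ell\bigl(y, f(x)\bigr) = \max\bigl(0,\; 1 - y\, f(x)\bigr).
\]
% Hard Tsybakov margin condition: writing \eta(x) = \Pr(Y = 1 \mid X = x),
% it requires that there exist \tau > 0 such that
\[
  \bigl|\eta(X) - \tfrac{1}{2}\bigr| \ge \tau \quad \text{almost surely,}
\]
% i.e. the conditional class probability stays bounded away from 1/2.
The paper's contribution, per the abstract, is to obtain exponential rates for the SVM without imposing this hard margin condition.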
