Paper Title

Random Gegenbauer Features for Scalable Kernel Methods

Authors

Insu Han, Amir Zandieh, Haim Avron

Abstract

We propose efficient random features for approximating a new and rich class of kernel functions that we refer to as Generalized Zonal Kernels (GZK). Our proposed GZK family generalizes the zonal kernels (i.e., dot-product kernels on the unit sphere) by introducing radial factors in their Gegenbauer series expansion, and includes a wide range of ubiquitous kernel functions such as the entirety of dot-product kernels as well as the Gaussian and the recently introduced Neural Tangent kernels. Interestingly, by exploiting the reproducing property of the Gegenbauer polynomials, we can construct efficient random features for the GZK family based on randomly oriented Gegenbauer kernels. We prove subspace embedding guarantees for our Gegenbauer features, which ensure that our features can be used for approximately solving learning problems such as kernel k-means clustering, kernel ridge regression, etc. Empirical results show that our proposed features outperform recent kernel approximation methods.
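To illustrate the general idea of kernel approximation via random feature maps that the abstract builds on, here is a minimal sketch of the classical random Fourier features baseline for the Gaussian kernel (one of the "recent kernel approximation methods" the paper compares against). This is not the paper's Gegenbauer feature construction; the function name `random_fourier_features` and all parameter choices below are illustrative assumptions.

```python
import numpy as np

def random_fourier_features(X, num_features, gamma, rng):
    """Map X of shape (n, d) to features Z of shape (n, num_features) such that
    Z @ Z.T approximates the Gaussian kernel exp(-gamma * ||x - y||^2).
    For that kernel, the spectral density is N(0, 2*gamma*I), so frequencies
    are drawn with standard deviation sqrt(2*gamma)."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, num_features))
    b = rng.uniform(0.0, 2 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# Low-dimensional feature map whose inner products approximate the kernel.
Z = random_fourier_features(X, num_features=2000, gamma=0.5, rng=rng)
K_approx = Z @ Z.T

# Exact Gaussian kernel matrix for comparison.
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_exact = np.exp(-0.5 * sq_dists)

err = np.max(np.abs(K_approx - K_exact))
```

Downstream solvers (e.g., ridge regression or k-means) then operate on the explicit features `Z` in O(n) memory per feature instead of forming the full n-by-n kernel matrix, which is the same scalability argument the paper makes for its Gegenbauer features.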
