Title
An Empirical Analysis of the Laplace and Neural Tangent Kernels
Authors
Abstract
The neural tangent kernel is a kernel function defined over the parameter distribution of an infinite-width neural network. Despite the impracticality of this limit, the neural tangent kernel has allowed for a more direct study of neural networks and a gaze through the veil of their black box. More recently, it has been shown theoretically that the Laplace kernel and neural tangent kernel share the same reproducing kernel Hilbert space on $\mathbb{S}^{d-1}$, alluding to their equivalence. In this work, we analyze the practical equivalence of the two kernels. We first do so by matching the kernels exactly and then by matching the posteriors of a Gaussian process. Moreover, we analyze the kernels in $\mathbb{R}^d$ and experiment with them in the task of regression.
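As a rough illustration of the comparison the abstract describes, the sketch below evaluates the Laplace kernel and a closed-form NTK on unit-sphere inputs and uses each in a Gaussian-process posterior mean. The NTK formula here is the standard one for a bias-free two-layer ReLU network; the architecture, bandwidth $c$, and noise level are assumptions for illustration, not the paper's actual experimental setup.

```python
import numpy as np

def laplace_kernel(X, Y, c=1.0):
    """Laplace (exponential) kernel: exp(-c * ||x - y||)."""
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return np.exp(-c * d)

def ntk_relu_two_layer(X, Y):
    """Closed-form NTK of a bias-free two-layer ReLU network,
    valid for unit-norm inputs (an assumed architecture, chosen
    because its NTK has a simple arc-cosine expression)."""
    u = np.clip(X @ Y.T, -1.0, 1.0)               # cosine similarity on the sphere
    k0 = (np.pi - np.arccos(u)) / np.pi           # arc-cosine kernel, order 0
    k1 = (u * (np.pi - np.arccos(u)) + np.sqrt(1.0 - u**2)) / np.pi  # order 1
    return u * k0 + k1

def gp_posterior_mean(K_train, K_cross, y, noise=1e-2):
    """GP regression posterior mean: K_*x (K_xx + sigma^2 I)^{-1} y."""
    n = K_train.shape[0]
    return K_cross @ np.linalg.solve(K_train + noise * np.eye(n), y)

# Toy regression on S^{d-1}: normalize random inputs onto the unit sphere.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = np.sin(3.0 * X[:, 0])                         # arbitrary smooth target

K_lap = laplace_kernel(X, X)
K_ntk = ntk_relu_two_layer(X, X)

# Posterior means of the two kernels at the training points can then
# be compared directly, as in the paper's posterior-matching analysis.
m_lap = gp_posterior_mean(K_lap, K_lap, y)
m_ntk = gp_posterior_mean(K_ntk, K_ntk, y)
```

Note that on the diagonal ($u = 1$) the Laplace kernel evaluates to 1 while this two-layer NTK evaluates to 2, so matching the kernels exactly requires rescaling and tuning the bandwidth $c$.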