Paper Title


Confident Neural Network Regression with Bootstrapped Deep Ensembles

Authors

Laurens Sluijterman, Eric Cator, Tom Heskes

Abstract


With the rise in the popularity and usage of neural networks, trustworthy uncertainty estimation is becoming increasingly essential. One of the most prominent uncertainty estimation methods is Deep Ensembles (Lakshminarayanan et al., 2017). A classical parametric model has uncertainty in its parameters because the data on which the model is built is a random sample. A modern neural network has an additional uncertainty component, since the optimization of the network is random. Lakshminarayanan et al. (2017) noted that Deep Ensembles do not incorporate the classical uncertainty induced by the effect of finite data. In this paper, we present a computationally cheap extension of Deep Ensembles for the regression setting, called Bootstrapped Deep Ensembles, that explicitly takes this classical effect of finite data into account using a modified version of the parametric bootstrap. We demonstrate through an experimental study that our method significantly improves upon standard Deep Ensembles.
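The parametric-bootstrap idea the abstract describes can be sketched as follows. This is an illustrative toy, not the paper's exact algorithm: a polynomial regressor stands in for the neural network, and the noise model (homoscedastic Gaussian) and the resampling loop are assumptions made for brevity. The point it demonstrates is that each ensemble member is refit on targets resampled from the fitted model plus simulated noise, so the spread across members reflects the finite-data (parameter) uncertainty that plain Deep Ensembles miss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data; the polynomial model below stands in for a network.
x = np.linspace(-2.0, 2.0, 80)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)

# Step 1: fit an initial model and estimate the observation-noise level.
deg = 5
coef = np.polyfit(x, y, deg)
mu_hat = np.polyval(coef, x)
sigma_hat = np.std(y - mu_hat, ddof=deg + 1)

# Step 2: parametric bootstrap -- each member trains on targets resampled
# from the fitted mean plus simulated noise, so the member-to-member
# spread captures the classical uncertainty from having finite data.
n_members = 10
preds = []
for _ in range(n_members):
    y_boot = mu_hat + rng.normal(scale=sigma_hat, size=x.size)
    coef_b = np.polyfit(x, y_boot, deg)
    preds.append(np.polyval(coef_b, x))
preds = np.array(preds)

# Ensemble mean and a simple uncertainty band from the member spread.
mean_pred = preds.mean(axis=0)
std_pred = preds.std(axis=0)
print(mean_pred.shape, std_pred.shape)
```

In the paper's actual setting the refit step would retrain a neural network per member, which also mixes in the optimization randomness that Deep Ensembles already capture.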
