Paper Title

Complexity Measures for Neural Networks with General Activation Functions Using Path-based Norms

Paper Authors

Zhong Li, Chao Ma, Lei Wu

Paper Abstract

A simple approach is proposed to obtain complexity controls for neural networks with general activation functions. The approach is motivated by approximating the general activation functions with one-dimensional ReLU networks, which reduces the problem to the complexity controls of ReLU networks. Specifically, we consider two-layer networks and deep residual networks, for which path-based norms are derived to control complexities. We also provide preliminary analyses of the function spaces induced by these norms and a priori estimates of the corresponding regularized estimators.
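To make the two ingredients of the abstract concrete, below is a minimal numerical sketch (not the paper's construction): it writes the piecewise-linear interpolant of a smooth activation as a one-dimensional ReLU network, and computes the standard two-layer path norm sum_k |a_k| * ||w_k||_1, which is a common definition in this line of work and assumed here. The function names (fit_relu_1d, path_norm_two_layer), the choice of tanh, and the knot grid are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def fit_relu_1d(g, knots):
    """Represent the piecewise-linear interpolant of g on the given knots as a
    one-dimensional ReLU network: g(x) ~= g(knots[0]) + sum_i c_i * relu(x - knots[i])."""
    vals = g(knots)
    slopes = np.diff(vals) / np.diff(knots)                  # slope on each interval
    coefs = np.concatenate([[slopes[0]], np.diff(slopes)])   # slope changes at the knots
    def approx(x):
        x = np.atleast_1d(x)
        return vals[0] + coefs @ relu(x[None, :] - knots[:-1, None])
    return approx, coefs

def path_norm_two_layer(a, W):
    """Path norm sum_k |a_k| * ||w_k||_1 of a two-layer network
    f(x) = sum_k a_k * sigma(w_k . x)  (a common definition; assumed here)."""
    return float(np.sum(np.abs(a) * np.sum(np.abs(W), axis=1)))

if __name__ == "__main__":
    # Approximate tanh on [-4, 4] by a 1-D ReLU network; error shrinks as the grid refines.
    knots = np.linspace(-4.0, 4.0, 65)
    approx, coefs = fit_relu_1d(np.tanh, knots)
    xs = np.linspace(-4.0, 4.0, 2001)
    print("max interpolation error:", np.max(np.abs(approx(xs) - np.tanh(xs))))

    # Path norm of a random two-layer network with 100 neurons in 10 dimensions.
    rng = np.random.default_rng(0)
    a, W = rng.normal(size=100), rng.normal(size=(100, 10))
    print("path norm:", path_norm_two_layer(a, W))
```

The sketch only illustrates the reduction idea: once a general activation is replaced by such a 1-D ReLU network, complexity control reduces to that of ReLU networks, where path-based norms like the one above are available.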
