Paper Title

SignReLU neural network and its approximation ability

Authors

Jianfei Li, Han Feng, Ding-Xuan Zhou

Abstract

近年来,深层神经网络(DNN)在科学和技术的各个领域都引起了极大的关注。激活功能定义了DNNS中的神经元如何为其处理传入信号。它们对于学习非线性转换以及在连续的神经元层中进行多种计算至关重要。在过去的几年中,研究人员研究了DNN解释其力量和成功的近似能力。在本文中,我们使用不同的激活函数(称为Signrelu)探索DNN的近似能力。我们的理论结果表明,在近似性能方面,Signrelu网络在近似性能方面的表现优于有理网络。进行数值实验,将Signrelu与Relu,Leay Relu和Elu等现有激活进行比较,这些实验说明了Signrelu的竞争实践表现。

Deep neural networks (DNNs) have garnered significant attention in various fields of science and technology in recent years. Activation functions define how neurons in a DNN process incoming signals. They are essential for learning non-linear transformations and for performing diverse computations between successive layers of neurons. In the last few years, researchers have investigated the approximation ability of DNNs to explain their power and success. In this paper, we explore the approximation ability of DNNs with a different activation function, called SignReLU. Our theoretical results demonstrate that SignReLU networks outperform rational and ReLU networks in terms of approximation performance. Numerical experiments comparing SignReLU with existing activation functions such as ReLU, Leaky ReLU, and ELU illustrate the competitive practical performance of SignReLU.
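The abstract does not spell out the SignReLU formula. In the activation-function literature, SignReLU is commonly defined as x for x >= 0 and a*x/(1-x) for x < 0, i.e. ReLU on the positive half-line and a scaled softsign on the negative one. The minimal sketch below assumes that formulation; the function name signrelu and the slope parameter a are illustrative, not taken from the paper.

```python
import numpy as np

def signrelu(x, a=1.0):
    """SignReLU activation (assumed formulation, not quoted from the paper):
    identity on the non-negative part, a bounded softsign-type response
    a * x / (1 - x) on the negative part, so negative outputs stay in (-a, 0).
    """
    x = np.asarray(x, dtype=float)
    # For x < 0 the denominator equals 1 - x; clamping with minimum(x, 0)
    # keeps the denominator >= 1 on the positive branch and avoids
    # a spurious division-by-zero warning at x = 1.
    return np.where(x >= 0.0, x, a * x / (1.0 - np.minimum(x, 0.0)))

# Quick side-by-side with the activations mentioned in the abstract.
xs = np.array([-5.0, -1.0, -0.1, 0.0, 0.1, 1.0, 5.0])
print("SignReLU: ", signrelu(xs))
print("ReLU:     ", np.maximum(xs, 0.0))
print("LeakyReLU:", np.where(xs >= 0.0, xs, 0.01 * xs))
print("ELU:      ", np.where(xs >= 0.0, xs, np.exp(xs) - 1.0))
```

Under this assumed formulation, negative responses are bounded in (-a, 0), in contrast to the unbounded linear negative tail of Leaky ReLU.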
