Title
Improving Neural Network Learning Through Dual Variable Learning Rates
Authors
Abstract
This paper introduces and evaluates a novel training method for neural networks: Dual Variable Learning Rates (DVLR). Building on insights from behavioral psychology, the dual learning rates are used to emphasize correct and incorrect responses differently, thereby making the feedback to the network more specific. Further, the learning rates are varied as a function of the network's performance, thereby making training more efficient. DVLR was implemented on three types of networks, feedforward, convolutional, and residual, in two domains, MNIST and CIFAR-10. The results show consistently improved accuracy, demonstrating that DVLR is a promising, psychologically motivated technique for training neural network models.
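As a rough illustration of the two ingredients the abstract names, here is a minimal sketch applied to logistic regression on toy data. The specific rate values, the correctness test, and the accuracy-based decay schedule are all assumptions for illustration, not the paper's actual method: misclassified samples get a larger rate and correctly classified ones a smaller rate (the "dual" part), and both rates shrink as accuracy rises (the "variable" part).

```python
import numpy as np

# Hedged sketch of the DVLR idea: per-sample learning rates chosen by
# correctness ("dual"), scaled down as accuracy improves ("variable").
# All constants below are illustrative assumptions.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))               # toy 2-D inputs
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # separable through the origin

w = np.zeros(2)                             # linear model, no bias term
eta_correct, eta_wrong = 0.01, 0.1          # assumed base rates

for epoch in range(50):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))      # sigmoid predictions
    pred = (p > 0.5).astype(float)
    acc = float((pred == y).mean())
    scale = 1.0 - 0.5 * acc                 # "variable": decay with accuracy
    lr = np.where(pred == y, eta_correct, eta_wrong) * scale  # "dual"
    grad = p - y                            # dLoss/dlogit for log loss
    w -= (lr * grad) @ X / len(y)           # per-sample-weighted step

final_acc = float((((X @ w) > 0) == (y > 0.5)).mean())
print(final_acc)
```

The emphasis on errors (a larger rate for wrong responses) mirrors the behavioral-psychology framing of feedback in the abstract; in practice the same per-sample weighting could be applied to any gradient-based optimizer.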