Title
Hyperparameter Optimization in Binary Communication Networks for Neuromorphic Deployment
Authors
Abstract
Training neural networks for neuromorphic deployment is non-trivial. A variety of approaches have been proposed to adapt back-propagation, or back-propagation-like algorithms, so that they are appropriate for training these networks. Because these networks often have performance characteristics very different from those of traditional neural networks, it is frequently unclear how to set either the network topology or the hyperparameters to achieve optimal performance. In this work, we introduce a Bayesian approach for optimizing the hyperparameters of an algorithm for training binary communication networks that can be deployed to neuromorphic hardware. We show that by optimizing the hyperparameters on this algorithm for each dataset, we can achieve improvements in accuracy over the previous state-of-the-art for this algorithm on each dataset (by up to 15 percent). This jump in performance further underscores the potential of converting traditional neural networks into binary communication networks suitable for neuromorphic hardware.
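To make the abstract's method concrete, the following is a minimal, self-contained sketch of Bayesian hyperparameter optimization: a Gaussian-process surrogate with an RBF kernel and an expected-improvement acquisition function over a single normalized hyperparameter. The `objective` function here is a hypothetical, toy stand-in for validation accuracy, not the paper's actual training pipeline, and all names and settings (lengthscale, grid, budget) are illustrative assumptions.

```python
import math
import random

random.seed(0)

def objective(x):
    # Toy stand-in for validation accuracy as a function of one
    # normalized hyperparameter in [0, 1] (hypothetical; peak near 0.6).
    return 0.85 + 0.10 * math.exp(-((x - 0.6) ** 2) / 0.02)

def rbf(a, b, ls=0.15):
    # Squared-exponential (RBF) kernel with lengthscale ls.
    return math.exp(-((a - b) ** 2) / (2.0 * ls * ls))

def solve(A, b):
    # Gaussian elimination with partial pivoting (fine for tiny systems).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def posterior(xs, ys, x):
    # GP posterior mean/std at x, with observations centred on their mean.
    n = len(xs)
    ymean = sum(ys) / n
    K = [[rbf(xs[i], xs[j]) + (1e-6 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, [y - ymean for y in ys])
    k_star = [rbf(x, xi) for xi in xs]
    mu = sum(k * a for k, a in zip(k_star, alpha)) + ymean
    v = solve(K, k_star)
    var = max(1.0 - sum(k * w for k, w in zip(k_star, v)), 1e-12)
    return mu, math.sqrt(var)

def expected_improvement(mu, sd, best):
    # Expected improvement over the incumbent best (maximization).
    z = (mu - best) / sd
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (mu - best) * cdf + sd * pdf

# Three random initial evaluations, then ten Bayesian-optimization steps
# that each pick the unevaluated grid point maximizing expected improvement.
xs = [random.random() for _ in range(3)]
ys = [objective(x) for x in xs]
grid = [i / 100.0 for i in range(101)]
for _ in range(10):
    best = max(ys)
    cand = [c for c in grid if all(abs(c - xi) > 1e-9 for xi in xs)]
    nxt = max(cand, key=lambda c: expected_improvement(*posterior(xs, ys, c), best))
    xs.append(nxt)
    ys.append(objective(nxt))

best_y = max(ys)
best_x = xs[ys.index(best_y)]
```

In practice each "evaluation" would be a full training run of the binary communication network on a dataset, so the sample efficiency of the GP surrogate (a handful of evaluations rather than an exhaustive sweep) is the point of the approach.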