Paper Title

Training Neural Networks using SAT solvers

Paper Authors

Sahoo, Subham S.

Paper Abstract

We propose an algorithm that explores a global optimisation method, using SAT solvers, for training a neural network. Deep Neural Networks have achieved great feats in tasks like image recognition, speech recognition, etc. Much of their success can be attributed to gradient-based optimisation methods, which scale well to huge datasets while still giving solutions better than any other existing method. However, there exist learning problems, like the parity function and the Fast Fourier Transform, where a neural network trained with a gradient-based optimisation algorithm cannot properly capture the underlying structure of the learning task. Thus, exploring global optimisation methods is of utmost interest, as gradient-based methods get stuck in local optima. In the experiments, we demonstrate the effectiveness of our algorithm against the Adam optimiser in certain tasks like parity learning. However, in the case of image classification on the MNIST dataset, the performance of our algorithm was less than satisfactory. We further discuss the role of the size of the training dataset and of the hyper-parameter settings in keeping the problem scalable for a SAT solver.
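
The abstract does not spell out the SAT encoding, so the following is only a minimal, hypothetical sketch of the general idea behind parity learning as a satisfiability problem, written with the python-sat (pysat) package and its Glucose3 backend. The SECRET mask, the parity_label helper, and the naive exponential XOR-to-CNF expansion are all illustrative assumptions, not the paper's method: the solver is simply asked to recover the hidden subset of input bits that defines a parity function from labelled examples.

```python
from itertools import product

from pysat.solvers import Glucose3

N_BITS = 4
SECRET = [1, 0, 1, 1]  # hypothetical hidden mask the solver should recover

def parity_label(x, mask):
    # Label is the XOR (sum mod 2) of the input bits selected by the mask.
    return sum(xi & mi for xi, mi in zip(x, mask)) % 2

# Training set: every 4-bit input paired with its parity label.
data = [(x, parity_label(x, SECRET)) for x in product([0, 1], repeat=N_BITS)]

solver = Glucose3()
# SAT variables 1..N_BITS stand for the unknown mask bits w_1..w_4.
for x, y in data:
    active = [i + 1 for i in range(N_BITS) if x[i] == 1]
    if not active:
        continue  # the all-zero input always has parity 0 here
    # Encode XOR(active) == y by forbidding every assignment of the active
    # variables whose parity is wrong. This is exponential in len(active),
    # which is fine for a toy instance; a serious encoding would introduce
    # Tseitin-style auxiliary variables instead.
    for bits in product([0, 1], repeat=len(active)):
        if sum(bits) % 2 != y:
            solver.add_clause([v if b == 0 else -v
                               for v, b in zip(active, bits)])

if solver.solve():
    model = solver.get_model()
    learned = [1 if lit > 0 else 0 for lit in model[:N_BITS]]
    print("recovered mask:", learned)  # should equal SECRET
```

On this toy instance the clauses pin down the mask uniquely, so the printed mask should match SECRET exactly. This illustrates the appeal of a global method on parity-style tasks noted in the abstract: the solver reasons about the combinatorial structure directly, rather than descending a loss surface where gradient-based methods get stuck in local optima.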
