Paper Title
Spiking Neural Operators for Scientific Machine Learning
Paper Authors
Paper Abstract
The main computational task of Scientific Machine Learning (SciML) is function regression, required for both the inputs and the outputs of a simulation. Physics-Informed Neural Networks (PINNs) and neural operators (such as DeepONet) have been very effective in solving Partial Differential Equations (PDEs), but they tax computational resources heavily and cannot be readily adopted for edge computing. Here, we address this issue by considering Spiking Neural Networks (SNNs), which have shown promise in reducing energy consumption by two orders of magnitude or more. We present an SNN-based method to perform regression, which has been a challenge due to the inherent difficulty of representing a function's input domain and continuous output values as spikes. We first propose a new method for encoding continuous values into spikes based on a triangular matrix in space and time, and demonstrate its superior performance compared to existing methods. Next, we demonstrate that using a simple SNN architecture consisting of a Leaky Integrate-and-Fire (LIF) activation and two dense layers, we can achieve relatively accurate function regression results. Moreover, we can replace the LIF with a trained Multi-Layer Perceptron (MLP) network and obtain comparable results but three times faster. Then, we introduce the DeepONet, consisting of a branch (typically a Fully-connected Neural Network, FNN) for inputs and a trunk (also an FNN) for outputs. We can build a spiking DeepONet by replacing either the branch or the trunk with an SNN. We demonstrate this new approach for classification using the SNN in the branch, achieving results comparable to the literature. Finally, we design a spiking DeepONet for regression by replacing its trunk with an SNN, and achieve good accuracy for approximating functions as well as inferring solutions of differential equations.
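To make the "LIF activation between two dense layers" architecture mentioned in the abstract concrete, here is a minimal sketch of standard Leaky Integrate-and-Fire dynamics wrapped by two linear layers. This is an illustrative reconstruction, not the paper's implementation: the leak factor `beta`, threshold `v_th`, the random weights, and the rate-based output decoding are all assumptions for demonstration.

```python
import numpy as np

def lif_forward(currents, beta=0.9, v_th=1.0):
    """Standard Leaky Integrate-and-Fire dynamics over T timesteps.
    currents: (T, n) input currents; returns (T, n) binary spikes."""
    T, n = currents.shape
    v = np.zeros(n)
    spikes = np.zeros((T, n))
    for t in range(T):
        v = beta * v + currents[t]          # leaky integration of input
        fired = v >= v_th                   # fire when threshold is crossed
        spikes[t] = fired.astype(float)
        v = np.where(fired, 0.0, v)         # hard reset after a spike
    return spikes

# Two dense layers around the LIF layer, as in the abstract's simple
# architecture; the weights here are random placeholders, not trained.
rng = np.random.default_rng(0)
T, n_in, n_hidden = 16, 8, 32
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))

x_spikes = (rng.random((T, n_in)) < 0.3).astype(float)  # toy spike input
hidden_spikes = lif_forward(x_spikes @ W1)
# Decode a continuous regression output by averaging the readout over time
# (one plausible decoding choice; the paper's encoding/decoding may differ).
y = (hidden_spikes @ W2).mean(axis=0)
```

The abstract's further observation is that this LIF layer can itself be replaced by a trained MLP that imitates its input–output map, trading biological dynamics for speed.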
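The DeepONet structure the abstract builds on combines a branch embedding of the input function with a trunk embedding of the query location via a dot product. The sketch below shows that combination with single-layer tanh "networks" as stand-in branch and trunk; the sensor count, latent width, and weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
m, p = 50, 20                      # sensor points, latent width

W_branch = rng.normal(size=(m, p)) / np.sqrt(m)
W_trunk = rng.normal(size=(1, p))

def deeponet_eval(u_sensors, y):
    """G(u)(y) ~= sum_k b_k(u) * t_k(y): dot product of branch and trunk."""
    b = np.tanh(u_sensors @ W_branch)     # branch net b(u), shape (p,)
    t = np.tanh(np.array([y]) @ W_trunk)  # trunk net t(y), shape (p,)
    return float(b @ t)

u = np.sin(np.linspace(0, np.pi, m))      # example input function at sensors
val = deeponet_eval(u, 0.5)
```

In the spiking DeepONet described in the abstract, one of these two sub-networks is swapped for an SNN (the branch for classification, the trunk for regression) while the dot-product combination is unchanged.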