Paper Title

Accelerating Physics-Informed Neural Network Training with Prior Dictionaries

Paper Authors

Wei Peng, Weien Zhou, Jun Zhang, Wen Yao

Paper Abstract

Physics-Informed Neural Networks (PINNs) can be regarded as general-purpose PDE solvers, but training a PINN on a particular problem can be slow, and there is no theoretical guarantee on the corresponding error bounds. In this manuscript, we propose a variant called Prior Dictionary based Physics-Informed Neural Networks (PD-PINNs). Equipped with task-dependent dictionaries, PD-PINNs enjoy enhanced representation power on the tasks, which helps them capture the features provided by the dictionaries so that the proposed neural networks converge faster during training. In various numerical simulations, combining prior dictionaries significantly improves convergence speed compared with existing PINN methods. On the theoretical side, we obtain error bounds applicable to both PINNs and PD-PINNs for solving second-order elliptic partial differential equations. We prove that, under certain mild conditions, the prediction error of the neural network can be bounded by the expected loss of the PDE and the boundary conditions.
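
To make the idea concrete, below is a minimal sketch of one plausible way to equip a PINN with a prior dictionary, using a 1D Poisson problem -u''(x) = f(x) on [0, 1] with zero boundary conditions. The choice of sine modes as the dictionary and the concatenation of dictionary features to the network input are illustrative assumptions; the paper's exact construction of PD-PINNs may differ.

```python
# Illustrative sketch (not the paper's implementation): a PINN whose input is
# augmented with features from a task-dependent prior dictionary, trained on
# the PDE residual loss plus the boundary-condition loss.
import torch
import torch.nn as nn

def dictionary(x):
    # Hypothetical prior dictionary: low-frequency sine modes that already
    # satisfy the zero boundary conditions of this toy problem.
    return torch.cat([torch.sin(k * torch.pi * x) for k in (1, 2, 3)], dim=1)

class PDPINN(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        # Input = spatial coordinate plus the 3 dictionary features.
        self.net = nn.Sequential(
            nn.Linear(1 + 3, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(torch.cat([x, dictionary(x)], dim=1))

model = PDPINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
f = lambda x: (torch.pi ** 2) * torch.sin(torch.pi * x)  # source term

for step in range(2000):
    x = torch.rand(128, 1, requires_grad=True)        # interior collocation points
    u = model(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    residual = (-d2u - f(x)).pow(2).mean()            # PDE residual loss
    xb = torch.tensor([[0.0], [1.0]])
    boundary = model(xb).pow(2).mean()                # boundary-condition loss
    loss = residual + boundary
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this sketch the dictionary features give the network direct access to functions it would otherwise have to learn, which is the mechanism the abstract credits for faster convergence; replacing `dictionary` with an uninformative constant recovers a plain PINN baseline for comparison.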
